Principle of Maximum Entropy


Chapter 9: Principle of Maximum Entropy

Section 8.2 presented the technique of estimating input probabilities of a process that are unbiased but consistent with known constraints expressed in terms of averages, or expected values, of one or more quantities. This technique, the Principle of Maximum Entropy, was developed there for the simple case of one constraint and three input events, in which case the technique can be carried out analytically. It is described here for the more general case.

9.1 Problem Setup

Before the Principle of Maximum Entropy can be used, the problem domain needs to be set up. In cases involving physical systems, this means that the various states in which the system can exist need to be identified, and all the parameters involved in the constraints known. For example, the energy, electric charge, and other quantities associated with each of the quantum states are assumed known. It is not assumed in this step which particular state the system is actually in (which state is "occupied"). Indeed it is assumed that we cannot ever know this with certainty, and so we deal instead with the probability of each of the states being occupied. In applications to nonphysical systems, the various possible events have to be enumerated and the properties of each determined, particularly the values associated with each of the constraints. In this chapter we will apply the general mathematical derivation to two examples, one a business model, and the other a model of a physical system (both very simple and crude).

9.1.1 Berger's Burgers

This example was used in Chapter 8 to deal with inference and the analytic form of the Principle of Maximum Entropy. A fast-food restaurant offers three meals: burger, chicken, and fish. Now we suppose that the menu has been extended to include a gourmet low-fat tofu meal.
The price, Calorie count, and probability of each meal being delivered cold are listed in Table 9.1.

9.1.2 Magnetic Dipole Model

An array of magnetic dipoles (think of them as tiny magnets) is subjected to an externally applied magnetic field H, and therefore the energy of the system depends on their orientations and on the applied field. For simplicity our system contains only one such dipole, which from time to time is able to interchange information and energy with either of two environments, which are much larger collections of dipoles. Each

Item     Entree    Cost     Calories    Probability of arriving hot    Probability of arriving cold
Meal 1   Burger    $1.00
Meal 2   Chicken   $2.00
Meal 3   Fish      $3.00
Meal 4   Tofu      $8.00

Table 9.1: Berger's Burgers

[Figure: Left Environment | System (field H applied) | Right Environment]
Figure 9.1: Dipole moment example. (Each dipole can be either up or down.)

dipole, both in the system and in its two environments, can be either up or down. The system has one dipole, so it only has two states, corresponding to the two states for that dipole, up and down (if the system had n dipoles it would have 2^n states). The energy of each dipole is proportional to the applied field and depends on its orientation; the energy of the system is the sum of the energies of all the dipoles in the system, in our case only one such.

State    Alignment    Energy
U        up           −m_d H
D        down         m_d H

Table 9.2: Magnetic Dipole Moments

The constant m_d is expressed in Joules per Tesla, and its value depends on the physics of the particular dipole. For example, the dipoles might be electron spins, in which case m_d = 2 µ_B µ_0, where µ_0 = 4π × 10⁻⁷ henries per meter (in rationalized MKS units) is the permeability of free space, µ_B = eℏ/2m_e = 9.274 × 10⁻²⁴ Joules per Tesla is the Bohr magneton, and where ℏ = h/2π, h = 6.626 × 10⁻³⁴ Joule-seconds is Planck's constant, e = 1.602 × 10⁻¹⁹ coulombs is the magnitude of the charge of an electron, and m_e = 9.109 × 10⁻³¹ kilograms is the rest mass of an electron.

In Figure 9.1, the system is shown between two environments, and there are barriers between the environments and the system (represented by vertical lines) which prevent interaction (later we will remove the barriers to permit interaction). The dipoles, in both the system and the environments, may be either spin-up or spin-down. The magnetic field shown is applied to the system only, not to the environments.

The virtue of a model with only one dipole is that it is simple enough that the calculations can be carried out easily. Such a model is, of course, hopelessly simplistic and cannot be expected to lead to numerically accurate results.
A more realistic model would require so many dipoles and so many states that practical computations on the collection could never be done. For example, a mole of a chemical element is a small amount by everyday standards, but it contains Avogadro's number N_A = 6.022 × 10²³ of atoms, and a correspondingly large number of electron spins; the number of possible states would be 2 raised to that power. Just how large this number is can be appreciated by noting that the earth contains no more than 2^170 atoms, and the visible universe has about 10^80 atoms; both of these numbers are far less than the number of states in that model. Even if we are less ambitious and want to compute with a much smaller sample, say

200 spins, and want to represent in our computer the probability of each state (using only 8 bits per state), we would still need more bytes of memory than there are atoms in the earth. Clearly it is impossible to compute with so many states, so the techniques described in these notes cannot be carried through in detail. Nevertheless there are certain conclusions and general relationships we will be able to establish.

9.2 Probabilities

Although the problem has been set up, we do not know which actual state the system is in. To express what we do know despite this ignorance, or uncertainty, we assume that each of the possible states A_i has some probability of occupancy p(A_i), where i is an index running over the possible states. A probability distribution p(A_i) has the property that each of the probabilities is between 0 and 1 (possibly being equal to either 0 or 1), and (since the input events are mutually exclusive and exhaustive) the sum of all the probabilities is 1:

1 = Σ_i p(A_i)    (9.1)

As has been mentioned before, two observers may, because of their different knowledge, use different probability distributions. In other words, probability, and all quantities that are based on probabilities, are subjective, or observer-dependent. The derivations below can be carried out for any observer.

9.3 Entropy

Our uncertainty is expressed quantitatively by the information which we do not have about the state occupied. This information is

S = Σ_i p(A_i) log_2(1/p(A_i))    (9.2)

Information is measured in bits, as a consequence of the use of logarithms to base 2 in Equation 9.2. In dealing with real physical systems, with a huge number of states and therefore an entropy that is a very large number of bits, it is convenient to multiply the summation above by Boltzmann's constant k_B = 1.381 × 10⁻²³ Joules per Kelvin, and also to use natural logarithms rather than logarithms to base 2. Then S would be expressed in Joules per Kelvin:

S = k_B Σ_i p(A_i) ln(1/p(A_i))    (9.3)

In the context of both physical systems and communication systems the uncertainty is known as the entropy.
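Equation 9.2 is easy to evaluate numerically. The sketch below (in Python; the helper name entropy_bits is a choice of this illustration, not part of the text) computes the entropy in bits of a small probability distribution:

```python
import math

def entropy_bits(p):
    """Entropy S = sum_i p_i * log2(1/p_i) of Equation 9.2, in bits.
    A state with p_i = 0 contributes nothing, since x*log2(1/x) -> 0 as x -> 0."""
    return sum(pi * math.log2(1.0 / pi) for pi in p if pi > 0)

# Four equally likely states: the entropy is log2(4) = 2 bits,
# the maximum possible for four states.
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

Multiplying the same sum, taken with natural logarithms and the factor k_B, gives Equation 9.3 instead.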
Note that because the entropy is expressed in terms of probabilities, it also depends on the observer, so two people with different knowledge of the system would calculate different numerical values for the entropy.

9.4 Constraints

The entropy has its maximum value when all probabilities are equal (we assume the number of possible states is finite), and the resulting value for entropy is the logarithm of the number of states, with a possible scale factor like k_B. If we have no additional information about the system, then such a result seems reasonable. However, if we have additional information in the form of constraints, then the assumption of equal probabilities would probably not be consistent with those constraints. Our objective is to find the probability distribution that has the greatest uncertainty, and hence is as unbiased as possible.

For simplicity we consider only one such constraint here. We assume that we know the expected value of some quantity (the Principle of Maximum Entropy can handle multiple constraints but the mathematical

procedures and formulas are more complicated). The quantity in question is one for which each of the states of the system has its own amount, and the expected value is found by averaging the values corresponding to each of the states, taking into account the probabilities of those states. Thus if there is a quantity G for which each of the states has a value g(A_i), then we want to consider only those probability distributions for which the expected value is a known value G̃:

G̃ = Σ_i p(A_i) g(A_i)    (9.4)

Of course this constraint cannot be achieved if G̃ is less than the smallest g(A_i) or greater than the largest g(A_i).

9.4.1 Examples

For our Berger's Burgers example, suppose we are told that the average price of a meal is $2.50, and we want to estimate the separate probabilities of the various meals without making any other assumptions. Then our constraint would be

$2.50 = $1.00 p(B) + $2.00 p(C) + $3.00 p(F) + $8.00 p(T)    (9.5)

For our magnetic-dipole example, assume the energies for states U and D are denoted e(i), where i is either U or D, and assume the expected value of the energy is known to be some value Ẽ. All these energies are expressed in Joules. Then

Ẽ = e(U) p(U) + e(D) p(D)    (9.6)

The energies e(U) and e(D) depend on the externally applied magnetic field H. This parameter, which will be carried through the derivation, will end up playing an important role. If the formulas for the e(i) from Table 9.2 are used here,

Ẽ = m_d H [p(D) − p(U)]    (9.7)

9.5 Maximum Entropy, Analytic Form

The Principle of Maximum Entropy is based on the premise that when estimating the probability distribution, you should select the distribution which leaves you the largest remaining uncertainty (i.e., the maximum entropy) consistent with your constraints. That way you have not introduced any additional assumptions or biases into your calculations. This principle was used in Chapter 8 for the simple case of three probabilities and one constraint. The entropy could be maximized analytically.
Using the constraint and the fact that the probabilities add up to 1, we expressed two of the unknown probabilities in terms of the third. Next, the possible range of values of the probabilities was determined using the fact that each of the three lies between 0 and 1. Then, these expressions were substituted into the formula for entropy S so that it was expressed in terms of a single probability. Finally, any of several techniques could be used to find the value of that probability for which S is the largest.

This analytical technique does not extend to cases with more than three possible states and only one constraint. It is only practical because the constraint can be used to express the entropy in terms of a single variable. If there are, say, four unknowns and two equations, the entropy would be left as a function of two variables, rather than one. It would be necessary to search for its maximum in a plane. Perhaps this seems feasible, but what if there were five unknowns? (Or ten?) Searching in a space of three (or eight) dimensions would be necessary, and this is much more difficult. A different approach is developed in the next section, one well suited for a single constraint and many probabilities.

9.6 Maximum Entropy, Single Constraint

Let us assume the average value of some quantity with values g(A_i) associated with the various events A_i is known; call it G̃ (this is the constraint). Thus there are two equations, one of which comes from the constraint and the other from the fact that the probabilities add up to 1:

1 = Σ_i p(A_i)    (9.8)

G̃ = Σ_i p(A_i) g(A_i)    (9.9)

where G̃ cannot be smaller than the smallest g(A_i) or larger than the largest g(A_i). The entropy associated with this probability distribution is

S = Σ_i p(A_i) log_2(1/p(A_i))    (9.10)

when expressed in bits. In the derivation below this formula for entropy will be used. It works well for examples with a small number of states. In later chapters of these notes we will start using the more common expression for entropy in physical systems, expressed in Joules per Kelvin,

S = k_B Σ_i p(A_i) ln(1/p(A_i))    (9.11)

9.6.1 Dual Variable

Sometimes a problem is clarified by looking at a more general problem of which the original is a special case. Here, rather than focusing on a specific value of G̃, let's look at all possible values of G̃, which means the range between the smallest and largest values of g(A_i). Thus G becomes a variable rather than a known value (the known value will continue to be denoted G̃ here). Then rather than express things in terms of G as an independent variable, we will introduce a new dual variable, which we will call β, and express all the quantities of interest, including G, in terms of it. The original problem then reduces to finding the value of β which corresponds to the known, desired value G̃, i.e., the value of β for which G(β) = G̃.

The new variable β is known as a Lagrange multiplier, named after the French mathematician Joseph-Louis Lagrange (1736-1813).¹ Lagrange developed a general technique, using such variables, to perform constrained maximization, of which our current problem is a very simple case. We will not use the mathematical technique of Lagrange multipliers; it is more powerful and more complicated than we need. Here is what we will do instead.
We will start with the answer, which others have derived using Lagrange multipliers, and prove that it is correct. That is, we will give a formula for the probability distribution p(A_i) in terms of β and the g(A_i) parameters, and then prove that the entropy calculated from this distribution, S(β), is at least as large as the entropy of any probability distribution that has the same expected value for G, namely G(β). Therefore the use of β automatically maximizes the entropy. Then we will show how to find the value of β, and therefore indirectly all the quantities of interest, for the particular value G̃ of interest (this will be possible because G(β) is a monotonic function of β, so calculating its inverse can be done with zero-finding techniques).

9.6.2 Probability Formula

The probability distribution p(A_i) we want has been derived by others. It is a function of the dual variable β:

p(A_i) = 2^(−α) · 2^(−β g(A_i))    (9.12)

¹ See a biography of Lagrange at history/biographies/lagrange.html

which implies

log_2(1/p(A_i)) = α + β g(A_i)    (9.13)

where α is a convenient abbreviation² for this function of β:

α = log_2( Σ_i 2^(−β g(A_i)) )    (9.14)

Note that this formula for α guarantees that the p(A_i) from Equation 9.12 add up to 1, as required by Equation 9.8.

If β is known, the function α and the probabilities p(A_i) can be found and, if desired, the entropy S and the constraint variable G. In fact, if S is needed, it can be calculated directly, without evaluating the p(A_i); this is helpful if there are dozens or more probabilities to deal with. This short-cut is found by multiplying Equation 9.13 by p(A_i) and summing over i. The left-hand side is S, and the right-hand side simplifies because α and β are independent of i. The result is

S = α + β G    (9.15)

where S, α, and G are all functions of β.

9.6.3 The Maximum Entropy

It is easy to show that the entropy calculated from this probability distribution is at least as large as that for any probability distribution which leads to the same expected value of G. Recall the Gibbs inequality, Equation 6.4, which will be rewritten here with p(A_i) and p′(A_i) interchanged (it is valid either way):

Σ_i p′(A_i) log_2(1/p′(A_i)) ≤ Σ_i p′(A_i) log_2(1/p(A_i))    (9.16)

where p′(A_i) is any probability distribution and p(A_i) is any other probability distribution. The inequality is an equality if and only if the two probability distributions are the same.

The Gibbs inequality can be used to prove that the probability distribution of Equation 9.12 has the maximum entropy. Suppose there is another probability distribution p′(A_i) that leads to an expected value G′ and an entropy S′, i.e.,

1 = Σ_i p′(A_i)    (9.17)

G′ = Σ_i p′(A_i) g(A_i)    (9.18)

S′ = Σ_i p′(A_i) log_2(1/p′(A_i))    (9.19)

Then it is easy to show that, for any value of β, if G′ = G(β) then S′ ≤ S(β):

² The function α(β) is related to the partition function Z(β) of statistical physics: Z = 2^α, or α = log_2 Z.
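If β is treated as known, Equations 9.12, 9.14, and 9.15 can be evaluated directly. The following sketch (Python; the g values and the value of β are illustrative choices, not from the text) computes α and the probabilities, and checks that the short-cut of Equation 9.15 agrees with the direct entropy sum:

```python
import math

def maxent_distribution(g, beta):
    """Given the g(A_i) values and the dual variable beta, return
    alpha (Equation 9.14) and the probabilities p(A_i) (Equation 9.12)."""
    alpha = math.log2(sum(2.0 ** (-beta * gi) for gi in g))
    p = [2.0 ** (-alpha) * 2.0 ** (-beta * gi) for gi in g]
    return alpha, p

g = [1.0, 2.0, 3.0, 8.0]   # illustrative g(A_i) values
beta = 0.25                # illustrative dual variable
alpha, p = maxent_distribution(g, beta)

G = sum(pi * gi for pi, gi in zip(p, g))              # Equation 9.9
S_direct = sum(pi * math.log2(1.0 / pi) for pi in p)  # Equation 9.10
S_shortcut = alpha + beta * G                         # Equation 9.15
# The p(A_i) sum to 1 by construction, and S_direct equals S_shortcut.
```

The agreement of the two entropy computations is exact up to floating-point roundoff, which is the point of the short-cut: S follows from α, β, and G without touching the individual probabilities.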

S′ = Σ_i p′(A_i) log_2(1/p′(A_i))
   ≤ Σ_i p′(A_i) log_2(1/p(A_i))
   = Σ_i p′(A_i) [α + β g(A_i)]
   = α + β G′
   = S(β) + β [G′ − G(β)]    (9.20)

where Equations 9.16, 9.13, 9.17, 9.18, and 9.15 were used; the last expression equals S(β) when G′ = G(β). Thus the entropy associated with any alternative proposed probability distribution that leads to the same value for the constraint variable cannot exceed the entropy for the distribution that uses β.

9.6.4 Evaluating the Dual Variable

So far we have been considering the dual variable β to be an independent variable. If we start with a known value G̃, we want to use G̃ as an independent variable and calculate β in terms of it. In other words, we need to invert the function G(β), or find the β for which Equation 9.9 is satisfied. This task is not trivial; in fact most of the computational difficulty associated with the Principle of Maximum Entropy lies in this step. If there are a modest number of states and only one constraint in addition to the equation involving the sum of the probabilities, this step is not hard, as we will see. If there are more constraints this step becomes increasingly complicated, and if there are a large number of states the calculations cannot be done. In the case of more realistic models for physical systems, this summation is impossible to calculate, although the general relations among the quantities other than p(A_i) remain valid.

To find β, start with Equation 9.12 for p(A_i), multiply it by g(A_i) and by 2^α, and sum over the probabilities. The left-hand side becomes G(β) 2^α, because neither α nor G(β) depends on i. We already have an expression for α in terms of β (Equation 9.14), so the left-hand side becomes G(β) Σ_i 2^(−β g(A_i)). The right-hand side becomes Σ_i g(A_i) 2^(−β g(A_i)). Thus,

0 = Σ_i [g(A_i) − G(β)] 2^(−β g(A_i))    (9.21)

If this equation is multiplied by 2^(β G(β)), the result is

0 = f(β)    (9.22)

where the function f(β) is

f(β) = Σ_i [g(A_i) − G(β)] 2^(−β [g(A_i) − G(β)])    (9.23)

Equation 9.22 is the fundamental equation that is to be solved for particular values of G(β), for example G̃. The function f(β) depends on the model of the problem (i.e., the various g(A_i)) and on G̃, and that is all.
It does not depend explicitly on α or the probabilities p(A_i). How do we know that there is any value of β for which f(β) = 0? First, notice that since G̃ lies between the smallest and the largest g(A_i), there is at least one i for which (g(A_i) − G̃) is positive and at least one for which it is negative. It is not difficult to show that f(β) is a monotonic function of β, in the sense that if β₂ > β₁ then f(β₂) < f(β₁). For large positive values of β, the dominant term in the sum is the one that has the smallest value of g(A_i), and hence f is negative. Similarly, for large negative values of β, f is positive. It must therefore be zero for one and only one value of β (this reasoning relies on the fact that f(β) is a continuous function).
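Because f(β) is continuous and monotonically decreasing, the single root of Equation 9.22 can be found by bisection, with no derivative information needed. A minimal sketch (Python; the function names and the search interval are choices of this illustration, not part of the text):

```python
def f(beta, g, G_target):
    """f(beta) of Equation 9.23: positive for very negative beta,
    negative for very positive beta, monotonically decreasing between."""
    return sum((gi - G_target) * 2.0 ** (-beta * (gi - G_target)) for gi in g)

def solve_beta(g, G_target, lo=-50.0, hi=50.0, tol=1e-12):
    """Bisection search for the unique beta with f(beta) = 0.
    Assumes min(g) < G_target < max(g), so a sign change exists in [lo, hi]."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid, g, G_target) > 0:
            lo = mid   # f still positive: the root lies to the right
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For the Berger's Burgers data of the Examples subsection, solve_beta([1.0, 2.0, 3.0, 8.0], 2.50) returns a value near 0.2586 bits/dollar.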

9.6.5 Examples

For the Berger's Burgers example, suppose that you are told the average meal price is $2.50, and you want to estimate the probabilities p(B), p(C), p(F), and p(T). Here is what you know:

1 = p(B) + p(C) + p(F) + p(T)    (9.24)

0 = $1.00 p(B) + $2.00 p(C) + $3.00 p(F) + $8.00 p(T) − $2.50    (9.25)

S = p(B) log_2(1/p(B)) + p(C) log_2(1/p(C)) + p(F) log_2(1/p(F)) + p(T) log_2(1/p(T))    (9.26)

The entropy is the largest, subject to the constraints, if

p(B) = 2^(−α) · 2^(−β $1.00)    (9.27)
p(C) = 2^(−α) · 2^(−β $2.00)    (9.28)
p(F) = 2^(−α) · 2^(−β $3.00)    (9.29)
p(T) = 2^(−α) · 2^(−β $8.00)    (9.30)

where

α = log_2( 2^(−β $1.00) + 2^(−β $2.00) + 2^(−β $3.00) + 2^(−β $8.00) )    (9.31)

and β is the value for which f(β) = 0, where

f(β) = $0.50 · 2^(−$0.50 β) + $5.50 · 2^(−$5.50 β) − $1.50 · 2^($1.50 β) − $0.50 · 2^($0.50 β)    (9.32)

A little trial and error (or use of a zero-finding program) gives β = 0.2586 bits/dollar, α = 1.2371 bits, p(B) = 0.3546, p(C) = 0.2964, p(F) = 0.2478, p(T) = 0.1011, and S = 1.8835 bits. The entropy is smaller than the 2 bits which would be required to encode a single order of one of the four possible meals using a fixed-length code. This is because knowledge of the average price reduces our uncertainty somewhat. If more information is known about the orders, then a probability distribution that incorporates that information would have even lower entropy.

For the magnetic dipole example, we carry the derivation out with the magnetic field H set at some unspecified value. The results all depend on H as well as Ẽ.

1 = p(U) + p(D)    (9.33)

Ẽ = e(U) p(U) + e(D) p(D) = m_d H [p(D) − p(U)]    (9.34)

S = p(U) log_2(1/p(U)) + p(D) log_2(1/p(D))    (9.35)

The entropy is the largest, for the given energy Ẽ and magnetic field H, if

p(U) = 2^(−α) · 2^(β m_d H)    (9.36)
p(D) = 2^(−α) · 2^(−β m_d H)    (9.37)

where

α = log_2( 2^(β m_d H) + 2^(−β m_d H) )    (9.38)

and β is the value for which f(β) = 0, where

f(β) = (m_d H − Ẽ) 2^(−β (m_d H − Ẽ)) − (m_d H + Ẽ) 2^(β (m_d H + Ẽ))    (9.39)

Note that this example with only one dipole, and therefore only two states, does not actually require the Principle of Maximum Entropy, because there are two equations in two unknowns, p(U) and p(D) (you can solve Equation 9.39 for β using algebra). If there were two dipoles, there would be four states and algebra would not have been sufficient. If there were many more than four possible states, this procedure to calculate β would have been impractical or at least very difficult. We therefore ask, in Chapter 11 of these notes, what we can tell about the various quantities even if we cannot actually calculate numerical values for them using the summation over states.
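The algebraic solution mentioned above can be written out: combining 1 = p(U) + p(D) with Equation 9.34 gives p(U) = (1 − Ẽ/m_d H)/2 and p(D) = (1 + Ẽ/m_d H)/2, and β then follows from the ratio p(U)/p(D) = 2^(2 β m_d H) implied by Equations 9.36 and 9.37. A sketch (Python, with illustrative unit values for m_d H and Ẽ) confirming that this β satisfies Equation 9.39:

```python
import math

md_H = 1.0   # the product m_d * H, illustrative value in Joules
E = -0.4     # expected energy E-tilde, illustrative; must satisfy |E| < md_H

# Algebraic solution of the two-state problem:
pU = 0.5 * (1.0 - E / md_H)   # from 1 = p(U) + p(D) and Equation 9.34
pD = 0.5 * (1.0 + E / md_H)
beta = math.log2(pU / pD) / (2.0 * md_H)   # invert p(U)/p(D) = 2^(2*beta*md_H)

# This beta is a root of f(beta) from Equation 9.39:
f = (md_H - E) * 2.0 ** (-beta * (md_H - E)) \
    - (md_H + E) * 2.0 ** (beta * (md_H + E))
# f vanishes to within floating-point roundoff.
```

With two dipoles (four states) this shortcut is no longer available, and one falls back on zero-finding on f(β), as in the previous subsection.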

MIT OpenCourseWare
6.050J / 2.110J Information and Entropy
Spring 2008

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms


Physics 5153 Classical Mechanics. Principle of Virtual Work-1 P. Guterrez 1 Introducton Physcs 5153 Classcal Mechancs Prncple of Vrtual Work The frst varatonal prncple we encounter n mechancs s the prncple of vrtual work. It establshes the equlbrum condton of a mechancal

More information

Comparison of Regression Lines

Comparison of Regression Lines STATGRAPHICS Rev. 9/13/2013 Comparson of Regresson Lnes Summary... 1 Data Input... 3 Analyss Summary... 4 Plot of Ftted Model... 6 Condtonal Sums of Squares... 6 Analyss Optons... 7 Forecasts... 8 Confdence

More information

Rate of Absorption and Stimulated Emission

Rate of Absorption and Stimulated Emission MIT Department of Chemstry 5.74, Sprng 005: Introductory Quantum Mechancs II Instructor: Professor Andre Tokmakoff p. 81 Rate of Absorpton and Stmulated Emsson The rate of absorpton nduced by the feld

More information

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)

ANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U) Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of

More information

1 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations

1 Derivation of Rate Equations from Single-Cell Conductance (Hodgkin-Huxley-like) Equations Physcs 171/271 -Davd Klenfeld - Fall 2005 (revsed Wnter 2011) 1 Dervaton of Rate Equatons from Sngle-Cell Conductance (Hodgkn-Huxley-lke) Equatons We consder a network of many neurons, each of whch obeys

More information

Chapter 9: Statistical Inference and the Relationship between Two Variables

Chapter 9: Statistical Inference and the Relationship between Two Variables Chapter 9: Statstcal Inference and the Relatonshp between Two Varables Key Words The Regresson Model The Sample Regresson Equaton The Pearson Correlaton Coeffcent Learnng Outcomes After studyng ths chapter,

More information

Physics 2A Chapters 6 - Work & Energy Fall 2017

Physics 2A Chapters 6 - Work & Energy Fall 2017 Physcs A Chapters 6 - Work & Energy Fall 017 These notes are eght pages. A quck summary: The work-energy theorem s a combnaton o Chap and Chap 4 equatons. Work s dened as the product o the orce actng on

More information

Solution Thermodynamics

Solution Thermodynamics Soluton hermodynamcs usng Wagner Notaton by Stanley. Howard Department of aterals and etallurgcal Engneerng South Dakota School of nes and echnology Rapd Cty, SD 57701 January 7, 001 Soluton hermodynamcs

More information

FREQUENCY DISTRIBUTIONS Page 1 of The idea of a frequency distribution for sets of observations will be introduced,

FREQUENCY DISTRIBUTIONS Page 1 of The idea of a frequency distribution for sets of observations will be introduced, FREQUENCY DISTRIBUTIONS Page 1 of 6 I. Introducton 1. The dea of a frequency dstrbuton for sets of observatons wll be ntroduced, together wth some of the mechancs for constructng dstrbutons of data. Then

More information

1 GSW Iterative Techniques for y = Ax

1 GSW Iterative Techniques for y = Ax 1 for y = A I m gong to cheat here. here are a lot of teratve technques that can be used to solve the general case of a set of smultaneous equatons (wrtten n the matr form as y = A), but ths chapter sn

More information

1 Generating functions, continued

1 Generating functions, continued Generatng functons, contnued. Generatng functons and parttons We can make use of generatng functons to answer some questons a bt more restrctve than we ve done so far: Queston : Fnd a generatng functon

More information

Density matrix. c α (t)φ α (q)

Density matrix. c α (t)φ α (q) Densty matrx Note: ths s supplementary materal. I strongly recommend that you read t for your own nterest. I beleve t wll help wth understandng the quantum ensembles, but t s not necessary to know t n

More information

Primer on High-Order Moment Estimators

Primer on High-Order Moment Estimators Prmer on Hgh-Order Moment Estmators Ton M. Whted July 2007 The Errors-n-Varables Model We wll start wth the classcal EIV for one msmeasured regressor. The general case s n Erckson and Whted Econometrc

More information

Case A. P k = Ni ( 2L i k 1 ) + (# big cells) 10d 2 P k.

Case A. P k = Ni ( 2L i k 1 ) + (# big cells) 10d 2 P k. THE CELLULAR METHOD In ths lecture, we ntroduce the cellular method as an approach to ncdence geometry theorems lke the Szemeréd-Trotter theorem. The method was ntroduced n the paper Combnatoral complexty

More information

COS 521: Advanced Algorithms Game Theory and Linear Programming

COS 521: Advanced Algorithms Game Theory and Linear Programming COS 521: Advanced Algorthms Game Theory and Lnear Programmng Moses Charkar February 27, 2013 In these notes, we ntroduce some basc concepts n game theory and lnear programmng (LP). We show a connecton

More information

x i1 =1 for all i (the constant ).

x i1 =1 for all i (the constant ). Chapter 5 The Multple Regresson Model Consder an economc model where the dependent varable s a functon of K explanatory varables. The economc model has the form: y = f ( x,x,..., ) xk Approxmate ths by

More information

Chapter 1. Probability

Chapter 1. Probability Chapter. Probablty Mcroscopc propertes of matter: quantum mechancs, atomc and molecular propertes Macroscopc propertes of matter: thermodynamcs, E, H, C V, C p, S, A, G How do we relate these two propertes?

More information

Professor Terje Haukaas University of British Columbia, Vancouver The Q4 Element

Professor Terje Haukaas University of British Columbia, Vancouver  The Q4 Element Professor Terje Haukaas Unversty of Brtsh Columba, ancouver www.nrsk.ubc.ca The Q Element Ths document consders fnte elements that carry load only n ther plane. These elements are sometmes referred to

More information

Appendix for Causal Interaction in Factorial Experiments: Application to Conjoint Analysis

Appendix for Causal Interaction in Factorial Experiments: Application to Conjoint Analysis A Appendx for Causal Interacton n Factoral Experments: Applcaton to Conjont Analyss Mathematcal Appendx: Proofs of Theorems A. Lemmas Below, we descrbe all the lemmas, whch are used to prove the man theorems

More information

Lecture 3 Stat102, Spring 2007

Lecture 3 Stat102, Spring 2007 Lecture 3 Stat0, Sprng 007 Chapter 3. 3.: Introducton to regresson analyss Lnear regresson as a descrptve technque The least-squares equatons Chapter 3.3 Samplng dstrbuton of b 0, b. Contnued n net lecture

More information

Computation of Higher Order Moments from Two Multinomial Overdispersion Likelihood Models

Computation of Higher Order Moments from Two Multinomial Overdispersion Likelihood Models Computaton of Hgher Order Moments from Two Multnomal Overdsperson Lkelhood Models BY J. T. NEWCOMER, N. K. NEERCHAL Department of Mathematcs and Statstcs, Unversty of Maryland, Baltmore County, Baltmore,

More information

Mathematical Preparations

Mathematical Preparations 1 Introducton Mathematcal Preparatons The theory of relatvty was developed to explan experments whch studed the propagaton of electromagnetc radaton n movng coordnate systems. Wthn expermental error the

More information

Global Sensitivity. Tuesday 20 th February, 2018

Global Sensitivity. Tuesday 20 th February, 2018 Global Senstvty Tuesday 2 th February, 28 ) Local Senstvty Most senstvty analyses [] are based on local estmates of senstvty, typcally by expandng the response n a Taylor seres about some specfc values

More information

Supplementary Notes for Chapter 9 Mixture Thermodynamics

Supplementary Notes for Chapter 9 Mixture Thermodynamics Supplementary Notes for Chapter 9 Mxture Thermodynamcs Key ponts Nne major topcs of Chapter 9 are revewed below: 1. Notaton and operatonal equatons for mxtures 2. PVTN EOSs for mxtures 3. General effects

More information

Fundamental loop-current method using virtual voltage sources technique for special cases

Fundamental loop-current method using virtual voltage sources technique for special cases Fundamental loop-current method usng vrtual voltage sources technque for specal cases George E. Chatzaraks, 1 Marna D. Tortorel 1 and Anastasos D. Tzolas 1 Electrcal and Electroncs Engneerng Departments,

More information

Markov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing Bioinformatics Course Supplement

Markov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing Bioinformatics Course Supplement Markov Chan Monte Carlo MCMC, Gbbs Samplng, Metropols Algorthms, and Smulated Annealng 2001 Bonformatcs Course Supplement SNU Bontellgence Lab http://bsnuackr/ Outlne! Markov Chan Monte Carlo MCMC! Metropols-Hastngs

More information

The Second Anti-Mathima on Game Theory

The Second Anti-Mathima on Game Theory The Second Ant-Mathma on Game Theory Ath. Kehagas December 1 2006 1 Introducton In ths note we wll examne the noton of game equlbrum for three types of games 1. 2-player 2-acton zero-sum games 2. 2-player

More information

princeton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg

princeton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg prnceton unv. F 17 cos 521: Advanced Algorthm Desgn Lecture 7: LP Dualty Lecturer: Matt Wenberg Scrbe: LP Dualty s an extremely useful tool for analyzng structural propertes of lnear programs. Whle there

More information

Entropy generation in a chemical reaction

Entropy generation in a chemical reaction Entropy generaton n a chemcal reacton E Mranda Área de Cencas Exactas COICET CCT Mendoza 5500 Mendoza, rgentna and Departamento de Físca Unversdad aconal de San Lus 5700 San Lus, rgentna bstract: Entropy

More information

Statistical mechanics handout 4

Statistical mechanics handout 4 Statstcal mechancs handout 4 Explan dfference between phase space and an. Ensembles As dscussed n handout three atoms n any physcal system can adopt any one of a large number of mcorstates. For a quantum

More information

Notes on Frequency Estimation in Data Streams

Notes on Frequency Estimation in Data Streams Notes on Frequency Estmaton n Data Streams In (one of) the data streamng model(s), the data s a sequence of arrvals a 1, a 2,..., a m of the form a j = (, v) where s the dentty of the tem and belongs to

More information

THE SUMMATION NOTATION Ʃ

THE SUMMATION NOTATION Ʃ Sngle Subscrpt otaton THE SUMMATIO OTATIO Ʃ Most of the calculatons we perform n statstcs are repettve operatons on lsts of numbers. For example, we compute the sum of a set of numbers, or the sum of the

More information

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction ECONOMICS 5* -- NOTE (Summary) ECON 5* -- NOTE The Multple Classcal Lnear Regresson Model (CLRM): Specfcaton and Assumptons. Introducton CLRM stands for the Classcal Lnear Regresson Model. The CLRM s also

More information

ELASTIC WAVE PROPAGATION IN A CONTINUOUS MEDIUM

ELASTIC WAVE PROPAGATION IN A CONTINUOUS MEDIUM ELASTIC WAVE PROPAGATION IN A CONTINUOUS MEDIUM An elastc wave s a deformaton of the body that travels throughout the body n all drectons. We can examne the deformaton over a perod of tme by fxng our look

More information

j) = 1 (note sigma notation) ii. Continuous random variable (e.g. Normal distribution) 1. density function: f ( x) 0 and f ( x) dx = 1

j) = 1 (note sigma notation) ii. Continuous random variable (e.g. Normal distribution) 1. density function: f ( x) 0 and f ( x) dx = 1 Random varables Measure of central tendences and varablty (means and varances) Jont densty functons and ndependence Measures of assocaton (covarance and correlaton) Interestng result Condtonal dstrbutons

More information

A how to guide to second quantization method.

A how to guide to second quantization method. Phys. 67 (Graduate Quantum Mechancs Sprng 2009 Prof. Pu K. Lam. Verson 3 (4/3/2009 A how to gude to second quantzaton method. -> Second quantzaton s a mathematcal notaton desgned to handle dentcal partcle

More information

Uncertainty in measurements of power and energy on power networks

Uncertainty in measurements of power and energy on power networks Uncertanty n measurements of power and energy on power networks E. Manov, N. Kolev Department of Measurement and Instrumentaton, Techncal Unversty Sofa, bul. Klment Ohrdsk No8, bl., 000 Sofa, Bulgara Tel./fax:

More information

CSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography

CSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography CSc 6974 and ECSE 6966 Math. Tech. for Vson, Graphcs and Robotcs Lecture 21, Aprl 17, 2006 Estmatng A Plane Homography Overvew We contnue wth a dscusson of the major ssues, usng estmaton of plane projectve

More information

Robert Eisberg Second edition CH 09 Multielectron atoms ground states and x-ray excitations

Robert Eisberg Second edition CH 09 Multielectron atoms ground states and x-ray excitations Quantum Physcs 量 理 Robert Esberg Second edton CH 09 Multelectron atoms ground states and x-ray exctatons 9-01 By gong through the procedure ndcated n the text, develop the tme-ndependent Schroednger equaton

More information

THE ARIMOTO-BLAHUT ALGORITHM FOR COMPUTATION OF CHANNEL CAPACITY. William A. Pearlman. References: S. Arimoto - IEEE Trans. Inform. Thy., Jan.

THE ARIMOTO-BLAHUT ALGORITHM FOR COMPUTATION OF CHANNEL CAPACITY. William A. Pearlman. References: S. Arimoto - IEEE Trans. Inform. Thy., Jan. THE ARIMOTO-BLAHUT ALGORITHM FOR COMPUTATION OF CHANNEL CAPACITY Wllam A. Pearlman 2002 References: S. Armoto - IEEE Trans. Inform. Thy., Jan. 1972 R. Blahut - IEEE Trans. Inform. Thy., July 1972 Recall

More information

12. The Hamilton-Jacobi Equation Michael Fowler

12. The Hamilton-Jacobi Equation Michael Fowler 1. The Hamlton-Jacob Equaton Mchael Fowler Back to Confguraton Space We ve establshed that the acton, regarded as a functon of ts coordnate endponts and tme, satsfes ( ) ( ) S q, t / t+ H qpt,, = 0, and

More information

Chapter 13: Multiple Regression

Chapter 13: Multiple Regression Chapter 13: Multple Regresson 13.1 Developng the multple-regresson Model The general model can be descrbed as: It smplfes for two ndependent varables: The sample ft parameter b 0, b 1, and b are used to

More information

Physics 607 Exam 1. ( ) = 1, Γ( z +1) = zγ( z) x n e x2 dx = 1. e x2

Physics 607 Exam 1. ( ) = 1, Γ( z +1) = zγ( z) x n e x2 dx = 1. e x2 Physcs 607 Exam 1 Please be well-organzed, and show all sgnfcant steps clearly n all problems. You are graded on your wor, so please do not just wrte down answers wth no explanaton! Do all your wor on

More information

More metrics on cartesian products

More metrics on cartesian products More metrcs on cartesan products If (X, d ) are metrc spaces for 1 n, then n Secton II4 of the lecture notes we defned three metrcs on X whose underlyng topologes are the product topology The purpose of

More information

Conjugacy and the Exponential Family

Conjugacy and the Exponential Family CS281B/Stat241B: Advanced Topcs n Learnng & Decson Makng Conjugacy and the Exponental Famly Lecturer: Mchael I. Jordan Scrbes: Bran Mlch 1 Conjugacy In the prevous lecture, we saw conjugate prors for the

More information

Econ Statistical Properties of the OLS estimator. Sanjaya DeSilva

Econ Statistical Properties of the OLS estimator. Sanjaya DeSilva Econ 39 - Statstcal Propertes of the OLS estmator Sanjaya DeSlva September, 008 1 Overvew Recall that the true regresson model s Y = β 0 + β 1 X + u (1) Applyng the OLS method to a sample of data, we estmate

More information

ECEN 5005 Crystals, Nanocrystals and Device Applications Class 19 Group Theory For Crystals

ECEN 5005 Crystals, Nanocrystals and Device Applications Class 19 Group Theory For Crystals ECEN 5005 Crystals, Nanocrystals and Devce Applcatons Class 9 Group Theory For Crystals Dee Dagram Radatve Transton Probablty Wgner-Ecart Theorem Selecton Rule Dee Dagram Expermentally determned energy

More information

MMA and GCMMA two methods for nonlinear optimization

MMA and GCMMA two methods for nonlinear optimization MMA and GCMMA two methods for nonlnear optmzaton Krster Svanberg Optmzaton and Systems Theory, KTH, Stockholm, Sweden. krlle@math.kth.se Ths note descrbes the algorthms used n the author s 2007 mplementatons

More information

APPENDIX A Some Linear Algebra

APPENDIX A Some Linear Algebra APPENDIX A Some Lnear Algebra The collecton of m, n matrces A.1 Matrces a 1,1,..., a 1,n A = a m,1,..., a m,n wth real elements a,j s denoted by R m,n. If n = 1 then A s called a column vector. Smlarly,

More information

The Order Relation and Trace Inequalities for. Hermitian Operators

The Order Relation and Trace Inequalities for. Hermitian Operators Internatonal Mathematcal Forum, Vol 3, 08, no, 507-57 HIKARI Ltd, wwwm-hkarcom https://doorg/0988/mf088055 The Order Relaton and Trace Inequaltes for Hermtan Operators Y Huang School of Informaton Scence

More information

Power law and dimension of the maximum value for belief distribution with the max Deng entropy

Power law and dimension of the maximum value for belief distribution with the max Deng entropy Power law and dmenson of the maxmum value for belef dstrbuton wth the max Deng entropy Bngy Kang a, a College of Informaton Engneerng, Northwest A&F Unversty, Yanglng, Shaanx, 712100, Chna. Abstract Deng

More information

Implicit Integration Henyey Method

Implicit Integration Henyey Method Implct Integraton Henyey Method In realstc stellar evoluton codes nstead of a drect ntegraton usng for example the Runge-Kutta method one employs an teratve mplct technque. Ths s because the structure

More information

A particle in a state of uniform motion remain in that state of motion unless acted upon by external force.

A particle in a state of uniform motion remain in that state of motion unless acted upon by external force. The fundamental prncples of classcal mechancs were lad down by Galleo and Newton n the 16th and 17th centures. In 1686, Newton wrote the Prncpa where he gave us three laws of moton, one law of gravty,

More information

Resource Allocation with a Budget Constraint for Computing Independent Tasks in the Cloud

Resource Allocation with a Budget Constraint for Computing Independent Tasks in the Cloud Resource Allocaton wth a Budget Constrant for Computng Independent Tasks n the Cloud Wemng Sh and Bo Hong School of Electrcal and Computer Engneerng Georga Insttute of Technology, USA 2nd IEEE Internatonal

More information

Singular Value Decomposition: Theory and Applications

Singular Value Decomposition: Theory and Applications Sngular Value Decomposton: Theory and Applcatons Danel Khashab Sprng 2015 Last Update: March 2, 2015 1 Introducton A = UDV where columns of U and V are orthonormal and matrx D s dagonal wth postve real

More information

Feature Selection: Part 1

Feature Selection: Part 1 CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?

More information

Topics in Probability Theory and Stochastic Processes Steven R. Dunbar. Classes of States and Stationary Distributions

Topics in Probability Theory and Stochastic Processes Steven R. Dunbar. Classes of States and Stationary Distributions Steven R. Dunbar Department of Mathematcs 203 Avery Hall Unversty of Nebraska-Lncoln Lncoln, NE 68588-0130 http://www.math.unl.edu Voce: 402-472-3731 Fax: 402-472-8466 Topcs n Probablty Theory and Stochastc

More information

Stat260: Bayesian Modeling and Inference Lecture Date: February 22, Reference Priors

Stat260: Bayesian Modeling and Inference Lecture Date: February 22, Reference Priors Stat60: Bayesan Modelng and Inference Lecture Date: February, 00 Reference Prors Lecturer: Mchael I. Jordan Scrbe: Steven Troxler and Wayne Lee In ths lecture, we assume that θ R; n hgher-dmensons, reference

More information

Errors for Linear Systems

Errors for Linear Systems Errors for Lnear Systems When we solve a lnear system Ax b we often do not know A and b exactly, but have only approxmatons  and ˆb avalable. Then the best thng we can do s to solve ˆx ˆb exactly whch

More information

14 Lagrange Multipliers

14 Lagrange Multipliers Lagrange Multplers 14 Lagrange Multplers The Method of Lagrange Multplers s a powerful technque for constraned optmzaton. Whle t has applcatons far beyond machne learnng t was orgnally developed to solve

More information