Complexity of Inference in Topic Models

David Sontag, Daniel M. Roy
Massachusetts Institute of Technology

Abstract

We consider the computational complexity of finding the MAP assignment of topics to words in Latent Dirichlet Allocation. We show that, when the effective number of topics per document is small, exact inference takes polynomial time. In contrast, we show that, when a document has a large number of topics, finding the MAP assignment in LDA is NP-hard. Our results motivate further study of the structure in real-world topic models, and raise a number of questions about the requirements for accurate inference during both learning and test-time use of topic models.

1 Introduction

Probabilistic models of text and topics, known as topic models, are powerful tools for exploring large data sets and for making inferences about the content of documents. Topic models are frequently used for deriving low-dimensional representations of documents that are then used for information retrieval, document summarization, and classification [Blei & McAuliffe, 2008; Lacoste-Julien et al., 2009]. Almost all uses of topic models require inference. For example, unsupervised learning of topic models using Expectation Maximization requires the repeated computation of marginal probabilities of what topics are present in the documents. For applications in information retrieval and classification, each new document necessitates inference to determine what topics are present.

Although there is a wealth of literature on approximate inference algorithms for topic models, such as Gibbs sampling and variational inference [Blei et al., 2003; Griffiths & Steyvers, 2004; Mukherjee & Blei, 2009; Porteous et al., 2008; Teh et al., 2007], little is known about the complexity of exact inference. In this paper, we consider the computational complexity of inference in topic models, beginning with one of the simplest and most popular models, Latent Dirichlet Allocation (LDA) [Blei et al., 2003]. We chose to study LDA because we believe that it captures the essence of what makes inference easy or hard in topic models. Our hope is that our results will motivate discussion of the following questions, guiding research on both new topic models and approximate inference for topic models:

1. What is the structure of real-world LDA inference problems? Might there be structure in natural problem instances that makes them different from hard instances (e.g., those used in our reductions)?

2. How much does having accurate inference affect the results of learning? With a large training set or sufficiently long documents, might there be enough averaging for learning to succeed even with somewhat inaccurate inference?

3. What are the requirements of applications that use test-time inference? How accurate does test-time inference need to be? What quantities are needed (e.g., marginals, likelihood, most likely assignment)?

2 MAP inference

We will consider the inference problem for a single document. The LDA model states that the document, represented as a collection of words w = (w_1, w_2, ..., w_N), is generated as follows: a distribution over the T topics is sampled from a Dirichlet distribution, θ ~ Dir(α); then, for i = 1, ..., N, we sample a topic z_i ~ Multinomial(θ) and word w_i ~ Pr(w | z_i). Assume that these word distributions have been previously estimated, and denote by l_it = log Pr(w_i | z_i = t) the log probability of the i-th word being generated from topic t. After integrating out the topic distribution vector, the joint distribution of the topic assignments and words is given by

Pr(z_1, ..., z_N, w) = [Γ(Σ_t α_t) / Γ(Σ_t α_t + N)] ∏_t [Γ(n_t + α_t) / Γ(α_t)] ∏_{i=1}^N Pr(w_i | z_i),    (1)

where n_t is the total number of words assigned to topic t.

In this paper, we will focus on the inference problem of finding the most likely assignment of topics to words, i.e. the maximum a posteriori (MAP) assignment. Taking the logarithm of Eq. 1 and ignoring constants, finding the MAP assignment is seen to be equivalent to the following combinatorial optimization problem:

Φ = max_{x_it ∈ {0,1}, n_t}  Σ_t lg Γ(n_t + α_t) + Σ_{i,t} x_it l_it    (2)
    subject to  ∀i, Σ_t x_it = 1;  ∀t, Σ_i x_it = n_t,

where the indicator variable x_it = I[z_i = t] denotes the assignment of word i to topic t.

2.1 Exact maximization for a small number of topics

Suppose a document only uses τ ≪ T topics. That is, T could be large, but we are guaranteed that the MAP assignment for a document uses at most τ different topics. In this section, we show how we can use this knowledge to efficiently find a maximizing assignment of words to topics.

We first observe that, if we knew the number of words assigned to each topic, finding the MAP assignment is easy. For t ∈ {1, ..., T}, let n_t be the number of words assigned to topic t in the MAP assignment. Then, the MAP assignment x is found by solving the following optimization problem:

max_{x_it ∈ {0,1}}  Σ_{i,t} x_it l_it    (3)
    subject to  ∀i, Σ_t x_it = 1;  ∀t, Σ_i x_it = n_t,

which is equivalent to weighted b-matching in a bipartite graph (the words are on one side, the topics on the other) and can be optimally solved in time O(bm^3), where b = max_t n_t = O(N) and m = N + T [Schrijver, 2003].

We call (n_1, ..., n_T) a valid partition when n_t ≥ 0 and Σ_t n_t = N. Using weighted b-matching, we can find a MAP assignment of words to topics by trying all (T choose τ) = Θ(T^τ) choices of τ topics and all possible valid partitions with at most τ non-zeros:

for all subsets A ⊆ {1, 2, ..., T} such that |A| = τ do
  for all valid partitions n = (n_1, n_2, ..., n_T) such that n_t = 0 for t ∉ A do
    Φ_{A,n} ← Weighted-B-Matching(A, n, l) + Σ_t lg Γ(n_t + α_t)
  end for
end for
return arg max_{A,n} Φ_{A,n}
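For concreteness, the following is a minimal Python sketch of this procedure (not part of the paper; names such as map_lda_small_tau are illustrative). It assumes NumPy and SciPy, and solves each b-matching subproblem by replicating topic t exactly n_t times and running the Hungarian algorithm via scipy.optimize.linear_sum_assignment; this is equivalent here because the counts sum to N, while a dedicated b-matching or min-cost-flow solver would give the O(bm^3) bound cited above.

import itertools
import numpy as np
from scipy.special import gammaln
from scipy.optimize import linear_sum_assignment

def compositions(total, parts):
    """Yield all tuples of `parts` non-negative integers summing to `total`."""
    if parts == 1:
        yield (total,)
        return
    for first in range(total + 1):
        for rest in compositions(total - first, parts - 1):
            yield (first,) + rest

def map_lda_small_tau(log_word_probs, alpha, tau):
    """Exhaustive MAP inference for LDA when at most `tau` topics are used.

    log_word_probs: (N, T) array of finite values l_it = log Pr(w_i | z_i = t)
                    (use a large negative number in place of log 0).
    alpha:          (T,) array of Dirichlet hyperparameters.
    Returns (best objective Phi, topic assignment z of length N).
    """
    N, T = log_word_probs.shape
    best_phi, best_z = -np.inf, None
    for subset in itertools.combinations(range(T), tau):    # choices of tau topics
        for counts in compositions(N, tau):                  # valid partition, zero outside the subset
            # Replicate each chosen topic counts[j] times; the cardinality-constrained
            # matching then becomes an ordinary N-by-N assignment problem.
            cols = np.repeat(np.array(subset), counts)
            rows, chosen = linear_sum_assignment(-log_word_probs[:, cols])
            match_value = log_word_probs[rows, cols[chosen]].sum()
            n = np.zeros(T)
            n[list(subset)] = counts
            phi = match_value + gammaln(n + alpha).sum()      # objective of Eq. (2)
            if phi > best_phi:
                best_phi = phi
                best_z = np.empty(N, dtype=int)
                best_z[rows] = cols[chosen]
    return best_phi, best_z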

Figure 1: (Left) An LDA instance derived from a k-set packing instance. (Center) Plot of F(n_t) = lg Γ(n_t + α) for various values of α. The x-axis varies n_t, the number of words assigned to topic t, and the y-axis shows F(n_t). (Right) Behavior of lg Γ(n_t + α) as α → 0. The function is stable everywhere but at zero, where the reward for sparsity increases without bound.

There are at most N^τ valid partitions with τ non-zero counts. For each of these, we solve the b-matching problem to find the most likely assignment of words to topics that satisfies the cardinality constraints. Thus, the total running time is O((NT)^τ N(N + τ)^3). This is tractable when the number of topics τ appearing in a document is a constant.

2.2 Inference is NP-hard for large numbers of topics

In this section, we show that probabilistic inference is NP-hard in the general setting where a document may have a large number of topics in its MAP assignment. Let MAX-LDA(α) denote the decision problem of whether Φ > V (see Eq. 2) for some V ∈ ℝ, where the hyperparameters α_t = α for all topics t. We consider both α < 1 and α ≥ 1 because, as shown in Figure 1, the optimization problem is qualitatively different in these two cases.

Theorem 1. MAX-LDA(α) is NP-hard for all α > 0.

Proof. Our proof is a straightforward generalization of the approach used by Halperin & Karp [2005] to show that the minimum entropy set cover problem is hard to approximate. The proof is done by reduction from k-set packing (k-SP), for k ≥ 3. In k-SP, we are given a collection of k-element sets over some universe of elements Σ with |Σ| = n. The goal is to find the largest collection of disjoint sets. There exists a constant c < 1 such that it is NP-hard to decide whether a k-SP instance has (i) a solution with n/k disjoint sets covering all elements (called a perfect matching), or (ii) at most cn/k disjoint sets (called a (cn/k)-matching).

We now describe how to construct an LDA inference problem from a k-SP instance. This requires specifying the words in the document, the number of topics, and the word log probabilities l_it. Let each element i ∈ Σ correspond to a word w_i, and let each set t correspond to one topic. The document consists of all of the words (i.e., Σ). We assign uniform probability to the words in each topic, so that Pr(w_i | z_i = t) = 1/k for i ∈ t, and 0 otherwise. Figure 1 illustrates the resulting LDA model. The topics are on the top, and the words from the document are on the bottom. An edge is drawn between a topic (set) and a word (element) if the corresponding set contains that element.

What remains is to show that we can solve some k-SP problem by using this reduction and solving a MAX-LDA(α) problem. For technical reasons involving α > 1, we require that k is sufficiently large. We will use the following result, proved in the Appendix.

Lemma 2. Let P be a k-SP instance for k > (1 + α)^2, and let P′ be the derived MAX-LDA(α) instance. There exist constants C_U and C_L < C_U such that, if there is a perfect matching in P, then Φ ≥ C_U. If, on the other hand, there is at most a (cn/k)-matching in P, then Φ < C_L.

Let P be a k-SP instance for k > (3 + α)^2, P′ be the derived MAX-LDA(α) instance, and C_U and C_L < C_U be as in Lemma 2. Then, by testing whether Φ ≥ C_U or Φ < C_L, we can decide whether P has a perfect matching or at best a (cn/k)-matching. Hence k-SP reduces to MAX-LDA(α).
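As an illustration only (the helper name and the finite stand-in for log 0 below are our own choices, not the paper's), the construction can be written out as a short Python sketch that maps a k-set packing instance to the word log-probability matrix l of the derived MAX-LDA(α) instance:

import numpy as np

LOG_ZERO = -1e9   # finite stand-in for log 0: word i cannot be generated by topic t

def lda_from_set_packing(n_elements, sets, k):
    """Build the (n, T) matrix l_it for the MAX-LDA instance derived from a k-SP instance.

    n_elements: size n of the universe; the document contains one word per element.
    sets:       list of k-element subsets of {0, ..., n_elements - 1}; one topic per set.
    """
    T = len(sets)
    l = np.full((n_elements, T), LOG_ZERO)
    for t, s in enumerate(sets):
        assert len(s) == k, "every set in a k-SP instance has exactly k elements"
        for i in s:
            l[i, t] = -np.log(k)   # uniform probability 1/k over the k words of topic t
    return l

# Example: a 3-SP instance over 6 elements with three candidate sets.
l = lda_from_set_packing(6, [{0, 1, 2}, {2, 3, 4}, {3, 4, 5}], k=3)

A perfect packing then corresponds to an assignment in which every word goes to the topic of its covering set, so that n_t = k for each chosen topic, which is what makes the concentration term Σ_t lg Γ(n_t + α) in Eq. 2 large.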

The bold lines in Figure 1 indicate the MAP assignment, which for this example corresponds to a perfect matching for the original k-set packing instance. More realistic documents would have significantly more words than topics used. Although this is not possible while keeping k = 3, since the MAP assignment always has τ ≥ N/k, we can instead reduce from a k-set packing problem with k > 3. Lemma 2 shows that this is hard as well.

3 Conclusion

In this paper, we have shown that the complexity of inference in LDA strongly depends on the effective number of topics per document. When we can guarantee that a document is generated from a small number of topics (regardless of the number of topics in the model), MAX-LDA can be solved in polynomial time. On the other hand, if a document can use an arbitrary number of topics, MAX-LDA is NP-hard. The choice of hyperparameters for the Dirichlet does not affect our results. It would be interesting to show analogous results for computing marginals and the partition function. It is straightforward to extend both our positive and negative results to related models, such as probabilistic latent semantic analysis (PLSA) [Hofmann, 1999] or correlated topic models [Blei & Lafferty, 2006].

Appendix: Proof of Lemma 2

Proof of Lemma 2. Assume there are T sets, each having k ≥ 3 elements, and let Φ be the optimal LDA objective. Define F(n) = lg Γ(n + α). Since l_it is constant across all topics, the linear term in Eq. 2 will be a constant K. First, note that, if there is a perfect matching,

Φ ≥ (n/k) F(k) + (T − n/k) F(0) + K.    (4)

The F(0) term is the contribution of unused topics. Otherwise, assume that the best packing has γ ≤ cn/k sets, each with k elements. Then, by the properties of the log-gamma function,

Φ ≤ γ F(k) + [(n − γk)/(k − 1)] F(k − 1) + (T − n/k) F(0) + K,    (5)

where we assume, conservatively, that all of the remaining words are explained by topics assigned (k − 1) words. Also, since there was no perfect matching, there were at most T − n/k unused topics. Using our bound on γ, we have

Φ ≤ (cn/k) F(k) + [(n − cn)/(k − 1)] F(k − 1) + (T − n/k) F(0) + K    (6)
  = (cn/k) F(k) + [n(1 − c)/(k − 1)] F(k − 1) + (T − n/k) F(0) + K    (7)
  = (dn/k) F(k) + (T − n/k) F(0) + K,    (8)

where

d := c + (1 − c)β,  for  β := [F(k − 1)/F(k)] · [k/(k − 1)].    (9)

Note that F(k)/k → ∞ as k → ∞. Along with the convexity of F, it follows that there exists a k_0 such that β < 1 for all k > k_0. Note that k > (3 + α)^2 suffices. This implies that d < 1, which shows that there is a non-zero gap between the possible values of Φ.

We have that k/(k − 1) → 1 as k → ∞. Therefore, β < 1 for some k if and only if the slope of F exceeds one at some point. Note that the maximum concentration objective, F(n) = n log n, satisfies the conditions on F and, in particular, we have β < 1 for k = 3.
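Purely as a numerical illustration of the gap argument (the values of α, c, and k below are arbitrary sample choices, not from the paper), one can check that β < 1, and hence d < 1, for a sample k > (3 + α)^2 and a packing constant c < 1:

from math import lgamma

def beta(k, alpha):
    """beta = (F(k-1) / F(k)) * (k / (k-1)) with F(n) = log Gamma(n + alpha), as in Eq. (9).
    The base of the logarithm cancels in the ratio, so the natural log is used here."""
    F = lambda n: lgamma(n + alpha)
    return (F(k - 1) / F(k)) * (k / (k - 1))

alpha, c = 0.5, 0.9                 # sample hyperparameter and packing constant c < 1
k = int((3 + alpha) ** 2) + 1       # any k > (3 + alpha)^2
b = beta(k, alpha)
d = c + (1 - c) * b                 # Eq. (9)
print(f"k = {k}, beta = {b:.4f}, d = {d:.4f}")   # beta < 1, so d < 1: a non-zero gap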

References

Blei, David, & Lafferty, John. 2006. Correlated Topic Models. In: Weiss, Y., Schölkopf, B., & Platt, J. (eds), Advances in Neural Information Processing Systems 18. Cambridge, MA: MIT Press.

Blei, David, & McAuliffe, Jon. 2008. Supervised Topic Models. In: Platt, J.C., Koller, D., Singer, Y., & Roweis, S. (eds), Advances in Neural Information Processing Systems 20. Cambridge, MA: MIT Press.

Blei, David M., Ng, Andrew Y., & Jordan, Michael I. 2003. Latent Dirichlet allocation. J. Mach. Learn. Res., 3.

Griffiths, Thomas L., & Steyvers, Mark. 2004. Finding scientific topics. Proceedings of the National Academy of Sciences of the United States of America, 101(Suppl 1).

Halperin, Eran, & Karp, Richard M. 2005. The minimum-entropy set cover problem. Theor. Comput. Sci., 348(2).

Hofmann, Thomas. 1999. Probabilistic latent semantic indexing. In: SIGIR '99: Proceedings of the 22nd annual international ACM SIGIR conference on Research and development in information retrieval. New York, NY, USA: ACM.

Lacoste-Julien, Simon, Sha, Fei, & Jordan, Michael. 2009. DiscLDA: Discriminative Learning for Dimensionality Reduction and Classification. In: Koller, D., Schuurmans, D., Bengio, Y., & Bottou, L. (eds), Advances in Neural Information Processing Systems 21.

Mukherjee, Indraneel, & Blei, David M. 2009. Relative Performance Guarantees for Approximate Inference in Latent Dirichlet Allocation. In: Koller, D., Schuurmans, D., Bengio, Y., & Bottou, L. (eds), Advances in Neural Information Processing Systems 21.

Porteous, Ian, Newman, David, Ihler, Alexander, Asuncion, Arthur, Smyth, Padhraic, & Welling, Max. 2008. Fast collapsed Gibbs sampling for latent Dirichlet allocation. In: KDD '08: Proceedings of the 14th ACM SIGKDD international conference on Knowledge discovery and data mining. New York, NY, USA: ACM.

Schrijver, Alexander. 2003. Combinatorial optimization: Polyhedra and efficiency. Vol. A: Paths, flows, matchings. Algorithms and Combinatorics, vol. 24. Berlin: Springer-Verlag.

Teh, Y. W., Newman, D., & Welling, M. 2007. A Collapsed Variational Bayesian Inference Algorithm for Latent Dirichlet Allocation. In: Advances in Neural Information Processing Systems, vol. 19.
