Gibbs Sampler Derivation for Latent Dirichlet Allocation (Blei et al., 2003) Lecture Notes Arjun Mukherjee (UH)
I. Generative process, Plates, Notations

II. The joint distribution:

$$P(W, Z; \alpha, \beta) = P(W \mid Z)\, P(Z) = I_1 \cdot I_2$$

$$I_1 = P(W \mid Z) = \int P(W \mid Z, \Phi)\, P(\Phi)\, d\Phi$$

$$I_2 = P(Z) = \int P(Z \mid \Theta)\, P(\Theta)\, d\Theta$$

Let us solve for $I_2$ first:

$$I_2 = P(Z) = \int P(Z \mid \Theta)\, P(\Theta)\, d\Theta$$

From the definition, $\Theta = \{\theta_1, \theta_2, \ldots, \theta_D\}$, where $\theta_d$ is the topic distribution of document $d \in \{1, \ldots, D\}$, so

$$P(\Theta) = \prod_{d=1}^{D} P(\theta_d \mid \alpha)$$

Now, we know $\theta_d \sim \mathrm{Dir}(\alpha)$. Recall that $\theta_d = \{\theta_{d,1}, \theta_{d,2}, \ldots, \theta_{d,K}\}$, where $\theta_{d,k}$ is the probability of topic $k \in \{1, \ldots, K\}$ in document $d$.

Recall that $X = \langle x_1, \ldots, x_K \rangle \sim \mathrm{Dir}(\alpha_1, \ldots, \alpha_K)$ has the following PDF:

$$f(x_1, \ldots, x_K; \alpha_1, \ldots, \alpha_K) = \frac{1}{B(\alpha)} \prod_{i=1}^{K} x_i^{\alpha_i - 1}, \quad \text{where } B(\alpha) = \frac{\prod_i \Gamma(\alpha_i)}{\Gamma\!\left(\sum_i \alpha_i\right)}$$

Also, since integrating the PDF over the simplex must equal 1 (by definition), we have
$$\int \frac{1}{B(\alpha)} \prod_{i=1}^{K} x_i^{\alpha_i - 1}\, dx = 1 \quad \text{or} \quad \int \prod_{i=1}^{K} x_i^{\alpha_i - 1}\, dx = B(\alpha) \quad \text{(Identity A)}$$

$$P(\Theta) = \prod_{d=1}^{D} P(\theta_d \mid \alpha) = \prod_{d=1}^{D} \frac{1}{B(\alpha)} \prod_{k=1}^{K} \theta_{d,k}^{\alpha - 1} \quad (1)$$

In our case, we assume each hyper-parameter of the $K$-dimensional Dirichlet is equal, $\alpha_1 = \cdots = \alpha_K = \alpha$, i.e., the vector $\alpha = \langle \alpha_1, \ldots, \alpha_K \rangle = \langle \alpha, \ldots, \alpha \rangle$.

Recall that $Z = \{z_{d,n}\} = \langle z_{d,n} : d \in \{1, \ldots, D\},\ n \in \{1, \ldots, N_d\} \rangle$, so

$$P(Z \mid \Theta) = \prod_{d=1}^{D} \prod_{n=1}^{N_d} p(z_{d,n} \mid \theta_d) \quad (2)$$

Now, since $z_{d,n} \sim \mathrm{Cat}(\theta_d)$, we have

$$p(z_{d,n} \mid \theta_d) = \prod_{k=1}^{K} \theta_{d,k}^{x_k}$$

where, for a given word $n$, out of $\{x_1, \ldots, x_K\}$ exactly one $x_k = 1$ and the rest are zero. This follows directly from the categorical distribution, because we sample a single topic for the $n$-th word in document $d$; the sampled topic $k$ has probability $\theta_{d,k}$. Thus,

$$\prod_{n=1}^{N_d} p(z_{d,n} \mid \theta_d) = \prod_{n=1}^{N_d} \prod_{k=1}^{K} \theta_{d,k}^{x_k} = \prod_{k=1}^{K} \theta_{d,k}^{\sum_n x_k} \quad (3)$$

as $y^a y^b = y^{a+b}$ where $y = \theta_{d,k}$. Now, $\sum_n x_k$ = # of words in document $d$ that were assigned to topic $k$. Let us denote this count by $C(d,k)$ or $C_d^k$:

$$C(d,k) = C_d^k = \sum_{n=1}^{N_d} x_k$$

Continuing from (3), we get

$$\prod_{n=1}^{N_d} p(z_{d,n} \mid \theta_d) = \prod_{k=1}^{K} \theta_{d,k}^{C(d,k)} \quad (4)$$

Substituting (4) in (2), we get

$$P(Z \mid \Theta) = \prod_{d=1}^{D} \prod_{n=1}^{N_d} p(z_{d,n} \mid \theta_d) = \prod_{d=1}^{D} \prod_{k=1}^{K} \theta_{d,k}^{C(d,k)} \quad (5)$$

Using (5) and (1), we get

$$I_2 = P(Z) = \int P(Z \mid \Theta)\, P(\Theta)\, d\Theta = \int \left[ \prod_{d=1}^{D} \prod_{k=1}^{K} \theta_{d,k}^{C(d,k)} \right] \left[ \prod_{d=1}^{D} \frac{1}{B(\alpha)} \prod_{k=1}^{K} \theta_{d,k}^{\alpha - 1} \right] d\Theta \quad (6)$$

Since the document-topic distributions $\theta_d$ are independent of each other, we can group as follows:

$$= \prod_{d=1}^{D} \frac{1}{B(\alpha)} \left[ \int \prod_{k=1}^{K} \theta_{d,k}^{\alpha - 1 + C(d,k)}\, d\theta_d \right] \quad (7)$$

Using Identity A, we get

$$\int \prod_{k=1}^{K} \theta_{d,k}^{\alpha - 1 + C(d,k)}\, d\theta_d = B(\alpha + C_d), \quad \text{where } C_d = \langle C_d^{k=1}, C_d^{k=2}, \ldots, C_d^{k=K} \rangle$$
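As a numeric aside (not part of the original notes): combining (7) with Identity A shows that $P(Z)$ depends on the assignments only through the counts $C(d,k)$. A minimal sketch, assuming SciPy is available and using hypothetical toy assignments:

```python
# Sketch: build C(d,k) from topic assignments and evaluate
# log P(Z) = sum_d [log B(alpha + C_d) - log B(alpha)], per (7) + Identity A.
# The toy assignments and symmetric alpha below are illustrative assumptions.
import numpy as np
from scipy.special import gammaln

def log_B(vec):
    """log of the multivariate Beta function B(vec)."""
    vec = np.asarray(vec, dtype=float)
    return gammaln(vec).sum() - gammaln(vec.sum())

K, alpha = 3, 0.5
# z[d] lists the topic assigned to each word of document d (toy example).
z = [[0, 0, 2], [1, 1, 1, 0]]

C_dk = np.zeros((len(z), K))          # C(d,k): words in doc d assigned topic k
for d, doc in enumerate(z):
    for k in doc:
        C_dk[d, k] += 1

log_P_Z = sum(log_B(alpha + C_dk[d]) - log_B(np.full(K, alpha))
              for d in range(len(z)))
print(C_dk)
print(log_P_Z)
```

For a hand-checkable case, with $K=2$, $\alpha=1$ (uniform prior) and a document with one word per topic, the formula gives $B(2,2)/B(1,1) = 1/6$, matching direct integration.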
Here, $B(\alpha + C_d) = \dfrac{\prod_k \Gamma(\alpha_k + C_d^k)}{\Gamma\!\left(\sum_k (\alpha_k + C_d^k)\right)}$. Continuing from (7), we get

$$I_2 = P(Z) = \int P(Z \mid \Theta)\, P(\Theta)\, d\Theta = \prod_{d=1}^{D} \frac{B(\alpha + C_d)}{B(\alpha)} \quad (8)$$

Now, simplifying $I_1$:

$$I_1 = P(W \mid Z) = \int P(W \mid Z, \Phi)\, P(\Phi)\, d\Phi$$

$$P(\Phi) = \prod_{k=1}^{K} P(\varphi_k \mid \beta)$$

Since $\varphi_k \sim \mathrm{Dir}(\beta)$,

$$P(\Phi) = \prod_{k=1}^{K} \frac{1}{B(\beta)} \prod_{v=1}^{V} \varphi_{k,v}^{\beta - 1} \quad (9)$$

$$P(W \mid Z, \Phi) = \prod_{d=1}^{D} \prod_{n=1}^{N_d} p(w_{d,n} \mid \varphi_{z_{d,n}}) \quad (10)$$

Since $w_{d,n} \sim \mathrm{Cat}(\varphi_k)$,

$$p(w_{d,n} \mid \varphi_k) = \prod_{v=1}^{V} \varphi_{k,v}^{x_{d,n}^v}$$

where, for a given word $n$ in doc $d$, out of $\{x_{d,n}^1, \ldots, x_{d,n}^V\}$ exactly one $x_{d,n}^v = 1$ and the rest are zero, i.e., $x_{d,n}^{v'} = 0$ for $v' \neq v$, as we are drawing a word $v$ from the topic-word distribution $\varphi_k$ with probability $\varphi_{k,v}$.

Continuing from (10) and substituting $p(w_{d,n} \mid \varphi_k)$,

$$P(W \mid Z, \Phi) = \prod_{d=1}^{D} \prod_{n=1}^{N_d} \prod_{v=1}^{V} \varphi_{k,v}^{x_{d,n}^v} = \prod_{k=1}^{K} \prod_{v=1}^{V} \varphi_{k,v}^{\sum_d \sum_n x_{d,n}^v}$$

Now, $\sum_d \sum_n x_{d,n}^v$ = # of times word $v$ was assigned to topic $k$. Let this count be denoted by $C_k^v$:

$$C(k,v) = C_k^v = \sum_{d} \sum_{n} x_{d,n}^v$$

Thus, we have

$$P(W \mid Z, \Phi) = \prod_{k=1}^{K} \prod_{v=1}^{V} \varphi_{k,v}^{C(k,v)} \quad (11)$$

Using (9) and (11), we get

$$I_1 = P(W \mid Z) = \int P(W \mid Z, \Phi)\, P(\Phi)\, d\Phi = \int \left[ \prod_{k=1}^{K} \prod_{v=1}^{V} \varphi_{k,v}^{C(k,v)} \right] \left[ \prod_{k=1}^{K} \frac{1}{B(\beta)} \prod_{v=1}^{V} \varphi_{k,v}^{\beta - 1} \right] d\Phi$$

Noting that the topic-word distributions $\varphi_k$ are independent and grouping the terms,

$$= \prod_{k=1}^{K} \frac{1}{B(\beta)} \left[ \int \prod_{v=1}^{V} \varphi_{k,v}^{\beta + C(k,v) - 1}\, d\varphi_k \right]$$

By Identity A, the integral simplifies to $B(\beta + C_k)$ where $C_k = \langle C_k^{v=1}, C_k^{v=2}, \ldots, C_k^{v=V} \rangle$; thus we obtain (12).
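The same pattern holds on the word side: $I_1$ depends on the data only through the topic-word counts $C(k,v)$. A minimal sketch (toy corpus and hyper-parameters are illustrative assumptions, SciPy assumed):

```python
# Sketch: build C(k,v) from (word id, topic) pairs and evaluate
# log I_1 = sum_k [log B(beta + C_k) - log B(beta)].
# The toy word ids/assignments and beta below are illustrative assumptions.
import numpy as np
from scipy.special import gammaln

def log_B(vec):
    vec = np.asarray(vec, dtype=float)
    return gammaln(vec).sum() - gammaln(vec.sum())

K, V, beta = 2, 4, 0.1
# Parallel lists: w[d][n] is the vocabulary id of word n in doc d,
# z[d][n] is its topic assignment.
w = [[0, 1, 1], [2, 3, 0]]
z = [[0, 0, 1], [1, 1, 0]]

C_kv = np.zeros((K, V))               # C(k,v): times word v assigned to topic k
for doc_w, doc_z in zip(w, z):
    for v, k in zip(doc_w, doc_z):
        C_kv[k, v] += 1

log_I1 = sum(log_B(beta + C_kv[k]) - log_B(np.full(V, beta)) for k in range(K))
print(C_kv)
print(log_I1)
```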
$$I_1 = P(W \mid Z) = \int P(W \mid Z, \Phi)\, P(\Phi)\, d\Phi = \prod_{k=1}^{K} \frac{B(\beta + C_k)}{B(\beta)} \quad (12)$$

Thus, using (8) and (12), the full joint distribution of the model can be written as

$$P(W, Z; \alpha, \beta) = P(W \mid Z)\, P(Z) = I_1 \cdot I_2 = \left[ \prod_{d=1}^{D} \frac{B(\alpha + C_d)}{B(\alpha)} \right] \left[ \prod_{k=1}^{K} \frac{B(\beta + C_k)}{B(\beta)} \right] \quad (13)$$

III. The Posterior on $\theta$ and $\varphi$

(A) Computing the posterior distribution for $\theta_d$ having observed the topic assignments $z_{d,n}$ in document $d$.

Prior: $\theta_d \mid \alpha \sim \mathrm{Dir}(\alpha)$. Prior probability: $P(\theta_d) = \frac{1}{B(\alpha)} \prod_{k=1}^{K} \theta_{d,k}^{\alpha - 1}$.

Likelihood: $z_{d,n} \mid \theta_d \sim \mathrm{Cat}(\theta_d)$. Likelihood of this document given $\theta_d$ (using (4)):

$$P(Z_d \mid \theta_d) = \prod_{n=1}^{N_d} p(z_{d,n} \mid \theta_d) = \prod_{k=1}^{K} \theta_{d,k}^{C(d,k)}$$

Posterior on $\theta_d$ using Bayes' theorem:

$$P(\theta_d \mid Z_d) = \frac{P(Z_d \mid \theta_d)\, P(\theta_d)}{\int P(Z_d \mid \theta_d)\, P(\theta_d)\, d\theta_d} \propto \prod_{k=1}^{K} \theta_{d,k}^{C(d,k) + \alpha - 1}$$

which is nothing but proportional to $\mathrm{Dir}(\alpha + C_d)$, i.e., the posterior is $\theta_d \mid Z_d \sim \mathrm{Dir}(\alpha + C_d)$. Here we see conjugate priors in action: since the Dirichlet is the conjugate prior of the categorical distribution, the posterior takes the form of the prior, i.e., another Dirichlet with added pseudo-counts. In fact, one can directly use the properties of conjugate priors to arrive at $\theta_d \mid Z_d \sim \mathrm{Dir}(\alpha + C_d)$. Using the properties of the Dirichlet distribution, one can easily obtain the expected value of the probability mass associated with each topic in document $d$ as follows:

$$E[\theta_{d,k}] = \hat{\theta}_{d,k} = \frac{\alpha + C(d,k)}{\sum_{k'} \left( \alpha + C(d,k') \right)} \quad (14)$$

(B) Computing the posterior distribution for $\varphi_k$.

Using the fact that the Dirichlet is also conjugate to the categorical distribution used for the word-emission process, we arrive at $\varphi_k \mid W, Z \sim \mathrm{Dir}(\beta + C_k)$. Thus,

$$E[\varphi_{k,v}] = \hat{\varphi}_{k,v} = \frac{\beta + C(k,v)}{\sum_{v'} \left( \beta + C(k,v') \right)} \quad (15)$$

IV. Gibbs sampler

Let $Z = \{z_{d,n}\} = \{z_{1,1}, \ldots, z_{1,N_1}, \ldots, z_{D,N_D}\}$ be an $N = \sum_d N_d$ dimensional random vector, i.e., $Z$ denotes the collection of all latent topic variables $z_{d,n}$ corresponding to all words in all documents.
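The point estimates (14) and (15) are just row-normalized, smoothed count matrices. A minimal sketch (the count matrices and hyper-parameters below are toy assumptions):

```python
# Sketch: posterior-mean estimates theta-hat (eq. 14) and phi-hat (eq. 15)
# from count matrices. Counts and hyper-parameters are toy assumptions.
import numpy as np

alpha, beta = 0.5, 0.1
C_dk = np.array([[2., 0., 1.],         # C(d,k): D=2 documents, K=3 topics
                 [1., 3., 0.]])
C_kv = np.array([[2., 1., 0., 0.],     # C(k,v): K=3 topics, V=4 word types
                 [0., 1., 1., 1.],
                 [1., 0., 0., 0.]])

theta_hat = (alpha + C_dk) / (alpha + C_dk).sum(axis=1, keepdims=True)  # (14)
phi_hat   = (beta  + C_kv) / (beta  + C_kv).sum(axis=1, keepdims=True)  # (15)

print(theta_hat)   # each row is a distribution over topics
print(phi_hat)     # each row is a distribution over the vocabulary
```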
Also, let us posit a Markov chain $X = \langle Z^{(0)}, Z^{(1)}, Z^{(2)}, \ldots, Z^{(N_{\mathrm{iter}})} \rangle$ over the data and the model, whose stationary distribution is the posterior distribution of $Z$.
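The transition kernel of this chain resamples one assignment $z_{d,n}$ at a time from its conditional given all other assignments. Before deriving that conditional in closed form below, note that it can always be computed by brute force from the joint (13): plug each candidate topic into $z_{d,n}$, evaluate the log-joint, and normalize (the terms not depending on the candidate cancel). A sketch on a hypothetical toy corpus, with SciPy assumed:

```python
# Sketch: brute-force Gibbs conditional for one token -- for each candidate
# topic k, set z[d][n] = k, evaluate log P(W, Z) via eq. (13), and normalize.
# The toy corpus and hyper-parameters are illustrative assumptions.
import numpy as np
from scipy.special import gammaln

def log_B(vec):
    return gammaln(vec).sum() - gammaln(vec.sum())

def log_joint(w, z, K, V, alpha, beta):
    """log P(W, Z; alpha, beta) from the count matrices, per eq. (13)."""
    D = len(w)
    C_dk = np.zeros((D, K)); C_kv = np.zeros((K, V))
    for d in range(D):
        for v, k in zip(w[d], z[d]):
            C_dk[d, k] += 1; C_kv[k, v] += 1
    lj  = sum(log_B(alpha + C_dk[d]) - log_B(np.full(K, alpha)) for d in range(D))
    lj += sum(log_B(beta  + C_kv[k]) - log_B(np.full(V, beta)) for k in range(K))
    return lj

K, V, alpha, beta = 2, 3, 0.5, 0.1
w = [[0, 1, 2], [2, 2, 0]]             # word ids per document (toy)
z = [[0, 1, 1], [1, 0, 0]]             # current topic assignments (toy)

d, n = 0, 1                            # the token (d, n) being resampled
logp = np.empty(K)
for k in range(K):                     # try every candidate topic
    z[d][n] = k
    logp[k] = log_joint(w, z, K, V, alpha, beta)
cond = np.exp(logp - logp.max()); cond /= cond.sum()
print(cond)                            # conditional over topics for token (d, n)
```

This is $O(K \cdot \text{corpus size})$ per token, which is exactly why the closed-form simplification derived next matters: it reduces each update to a ratio of counts.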
Also, let $Z = \{z_{d,n}\} = \{z_i\}$ be denoted by a single subscript for ease of notation. For a given token/word $i = (d,n)$, i.e., the word at position $n$ of document $d$, the Gibbs conditional (sampling distribution) for its latent topic $z_i$ can be constructed as follows:

$$P(z_i = k \mid Z_{\neg i}, W) = \frac{P(Z, W)}{P(Z_{\neg i}, W)} = \frac{P(W \mid Z)\, P(Z)}{P(W_{\neg i} \mid Z_{\neg i})\, P(Z_{\neg i})\, p(w_i)} \propto \frac{P(W \mid Z)\, P(Z)}{P(W_{\neg i} \mid Z_{\neg i})\, P(Z_{\neg i})} \quad (16)$$

where $Z = \{z_i, Z_{\neg i}\}$ and $W = \{w_i, W_{\neg i}\}$. Equation (16) gives us the probability that the latent topic variable $z_i$ at $i = (d,n)$ is assigned to topic $k \in \{1, \ldots, K\}$, having observed all other topic assignments and all words except $w_i$. For the subsequent steps, the subscript $\neg i$ denotes all counts/variables/functional values upon discounting (i.e., not accounting for) the token $w_i$.

Expanding the sampling distribution using (13), we get:

$$P(z_i = k \mid Z_{\neg i}, W_{\neg i}, w_i) \propto \frac{P(W \mid Z)\, P(Z)}{P(W_{\neg i} \mid Z_{\neg i})\, P(Z_{\neg i})} = \frac{\left[ \prod_d B(\alpha + C_d) \right] \left[ \prod_k B(\beta + C_k) \right]}{\left[ \prod_d B(\alpha + C_d^{\neg i}) \right] \left[ \prod_k B(\beta + C_k^{\neg i}) \right]} \quad (17)$$

Thus, the Gibbs sampling distribution for $p(z_i = k)$ is proportional to the full joint distribution of the model divided by the joint computed as if the token/word $w_i$ and its associated topic assignment did not exist in our data/model.

Observing that $\alpha$ remains fixed, and that $i$ corresponds to the topic and word $\{z_i, w_i\}$ at some document $d$ and some position $n$ in that document, we can further simplify the first term in (17) as

$$\frac{\prod_{d'} B(\alpha + C_{d'})}{\prod_{d'} B(\alpha + C_{d'}^{\neg i})} = \frac{B(\alpha + C_d)}{B(\alpha + [C_d]^{\neg i})} \quad (18)$$

since for all documents other than $d$ the numerator and denominator remain exactly the same and cancel out. Now let us see what changes happen in the count vector $C_d$ with or without including the term/word at $w_i = (d,n)$, whose latent topic assignment is $z_i = k'$.
We recall that $C_d = \langle C_d^{k=1}, C_d^{k=2}, \ldots, C_d^{k=K} \rangle$ and $C_d^k = C(d,k)$ = # of words in document $d$ that were assigned to topic $k$. Hence, with $z_i = k'$, we can write

$$C(d,k) = \begin{cases} [C(d,k)]^{\neg i} & ; \; k \neq k' \\ [C(d,k)]^{\neg i} + 1 & ; \; k = k' \end{cases} \quad (19)$$

Expanding (18) using the definition $B(\alpha) = \frac{\prod_k \Gamma(\alpha_k)}{\Gamma\left(\sum_k \alpha_k\right)}$, we get

$$\frac{B(\alpha + C_d)}{B(\alpha + [C_d]^{\neg i})} = \frac{\prod_k \Gamma(\alpha + C(d,k))}{\Gamma\!\left(\sum_k (\alpha + C(d,k))\right)} \cdot \frac{\Gamma\!\left(\sum_k (\alpha + C(d,k)^{\neg i})\right)}{\prod_k \Gamma(\alpha + C(d,k)^{\neg i})}$$

Using the result in (19), this further simplifies (upon canceling out all non-$k'$ terms) to

$$= \frac{\Gamma(\alpha + C(d,k'))}{\Gamma(\alpha + C(d,k')^{\neg i})} \cdot \frac{\Gamma\!\left(\sum_{k \neq k'} (\alpha + C(d,k)^{\neg i}) + \left[\alpha + C(d,k')^{\neg i}\right]\right)}{\Gamma\!\left(\sum_{k \neq k'} (\alpha + C(d,k)^{\neg i}) + \left[\alpha + C(d,k')^{\neg i} + 1\right]\right)}$$

$$= \frac{\Gamma(\alpha + C(d,k')^{\neg i} + 1)}{\Gamma(\alpha + C(d,k')^{\neg i})} \cdot \frac{\Gamma\!\left(\sum_{k} (\alpha + C(d,k)^{\neg i})\right)}{\Gamma\!\left(\sum_{k} (\alpha + C(d,k)^{\neg i}) + 1\right)}$$
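The cancellation above can be sanity-checked numerically: the full $B$-function ratio collapses to two pairs of Gamma terms involving only topic $k'$. A sketch with hypothetical toy counts (SciPy assumed):

```python
# Sketch: check that B(alpha + C_d) / B(alpha + C_d_noti) equals the
# two-Gamma-pair expression above when only topic k' gains one count.
# The counts, alpha, and k' below are toy assumptions.
import numpy as np
from scipy.special import gammaln

def log_B(vec):
    return gammaln(vec).sum() - gammaln(vec.sum())

alpha = 0.5
C_noti = np.array([3., 1., 2.])        # [C(d,k)]_noti for K=3 topics
kp = 1                                 # k': the discounted token's topic
C = C_noti.copy(); C[kp] += 1          # eq. (19): only topic k' gains one count

lhs = log_B(alpha + C) - log_B(alpha + C_noti)
S = (alpha + C_noti).sum()
rhs = (gammaln(alpha + C_noti[kp] + 1) - gammaln(alpha + C_noti[kp])
       + gammaln(S) - gammaln(S + 1))
print(np.exp(lhs), np.exp(rhs))        # the two expressions agree
```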
Using the identity $\Gamma(x+1) = x\,\Gamma(x)$, we get

$$\frac{B(\alpha + C_d)}{B(\alpha + [C_d]^{\neg i})} = \frac{\alpha + C(d,k')^{\neg i}}{\sum_k \left( \alpha + C(d,k)^{\neg i} \right)}$$

In a similar way, one can also simplify the second term in (17) as follows:

$$\frac{\prod_k B(\beta + C_k)}{\prod_k B(\beta + C_k^{\neg i})} = \frac{B(\beta + C_{k=k'})}{B(\beta + [C_{k=k'}]^{\neg i})} = \frac{\beta + C(k',v')^{\neg i}}{\sum_v \left( \beta + C(k',v)^{\neg i} \right)}$$

where $v'$ refers to the vocabulary token assigned to $w_i$. We can thus write the full Gibbs conditional as follows:

$$P(z_i = k' \mid Z_{\neg i}, W) \propto \left[ \frac{\alpha + C(d,k')^{\neg i}}{\sum_k \left( \alpha + C(d,k)^{\neg i} \right)} \right] \left[ \frac{\beta + C(k',v')^{\neg i}}{\sum_v \left( \beta + C(k',v)^{\neg i} \right)} \right]$$

The full algorithm for Gibbs sampling is as in Fig. 8 of [Heinrich, 2008]. There are some minor notation changes, noted below:

$D \leftrightarrow M$; $\; C_d^k \leftrightarrow n_m^{(k)}$; $\; C_k^v \leftrightarrow n_k^{(t)}$; $\; n_m = \sum_k n_m^{(k)}$; $\; n_k = \sum_t n_k^{(t)}$
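Putting the final conditional to work, a collapsed Gibbs sweep only needs the count tables, decrementing them for the current token before sampling and incrementing after. The following is a minimal sketch, not the algorithm verbatim from [Heinrich, 2008]; the toy corpus, hyper-parameters, and iteration count are illustrative assumptions:

```python
# Sketch: collapsed Gibbs sampler for LDA using the final conditional above.
# Toy corpus, hyper-parameters, and N_iter are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
K, V, alpha, beta = 2, 5, 0.5, 0.1
docs = [[0, 0, 1, 2], [2, 3, 4, 4], [0, 1, 3]]   # word ids per document

# Initialize assignments randomly and build the count tables.
z = [[int(rng.integers(K)) for _ in doc] for doc in docs]
C_dk = np.zeros((len(docs), K)); C_kv = np.zeros((K, V)); C_k = np.zeros(K)
for d, doc in enumerate(docs):
    for n, v in enumerate(doc):
        k = z[d][n]; C_dk[d, k] += 1; C_kv[k, v] += 1; C_k[k] += 1

for it in range(200):                  # N_iter sweeps of the chain
    for d, doc in enumerate(docs):
        for n, v in enumerate(doc):
            k = z[d][n]                # discount token i = (d, n): "not i" counts
            C_dk[d, k] -= 1; C_kv[k, v] -= 1; C_k[k] -= 1
            # P(z_i = k' | Z_not_i, W); the document-side denominator is
            # constant in k' and is absorbed by the normalization.
            p = (alpha + C_dk[d]) * (beta + C_kv[:, v]) / (V * beta + C_k)
            k = int(rng.choice(K, p=p / p.sum()))
            z[d][n] = k                # re-add the token with its new topic
            C_dk[d, k] += 1; C_kv[k, v] += 1; C_k[k] += 1

phi_hat = (beta + C_kv) / (V * beta + C_k)[:, None]   # eq. (15)
print(phi_hat.round(3))
```

After burn-in, the count tables from one (or an average over several) states of the chain are plugged into (14) and (15) to read off the estimated distributions.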
More informationSpecialist Mathematics 2019 v1.2
181314 Mensuration circumference of a circle area of a parallelogram CC = ππππ area of a circle AA = ππrr AA = h area of a trapezium AA = 1 ( + )h area of a triangle AA = 1 h total surface area of a cone
More informationControl of Mobile Robots
Control of Mobile Robots Regulation and trajectory tracking Prof. Luca Bascetta (luca.bascetta@polimi.it) Politecnico di Milano Dipartimento di Elettronica, Informazione e Bioingegneria Organization and
More information10.4 Controller synthesis using discrete-time model Example: comparison of various controllers
10. Digital 10.1 Basic principle of digital control 10.2 Digital PID controllers 10.2.1 A 2DOF continuous-time PID controller 10.2.2 Discretisation of PID controllers 10.2.3 Implementation and tuning 10.3
More informationCS249: ADVANCED DATA MINING
CS249: ADVANCED DATA MINING Vector Data: Clustering: Part II Instructor: Yizhou Sun yzsun@cs.ucla.edu May 3, 2017 Methods to Learn: Last Lecture Classification Clustering Vector Data Text Data Recommender
More informationLecture # 31. Questions of Marks 3. Question: Solution:
Lecture # 31 Given XY = 400, X = 5, Y = 4, S = 4, S = 3, n = 15. Compute the coefficient of correlation between XX and YY. r =0.55 X Y Determine whether two variables XX and YY are correlated or uncorrelated
More informationUnit II. Page 1 of 12
Unit II (1) Basic Terminology: (i) Exhaustive Events: A set of events is said to be exhaustive, if it includes all the possible events. For example, in tossing a coin there are two exhaustive cases either
More informationF.3 Special Factoring and a General Strategy of Factoring
F.3 Special Factoring and a General Strategy of Factoring Difference of Squares section F4 233 Recall that in Section P2, we considered formulas that provide a shortcut for finding special products, such
More informationReview for Exam Hyunse Yoon, Ph.D. Assistant Research Scientist IIHR-Hydroscience & Engineering University of Iowa
57:020 Fluids Mechanics Fall2013 1 Review for Exam3 12. 11. 2013 Hyunse Yoon, Ph.D. Assistant Research Scientist IIHR-Hydroscience & Engineering University of Iowa 57:020 Fluids Mechanics Fall2013 2 Chapter
More informationAngular Momentum, Electromagnetic Waves
Angular Momentum, Electromagnetic Waves Lecture33: Electromagnetic Theory Professor D. K. Ghosh, Physics Department, I.I.T., Bombay As before, we keep in view the four Maxwell s equations for all our discussions.
More informationChemical Engineering 412
Chemical Engineering 412 Introductory Nuclear Engineering Lecture 7 Nuclear Decay Behaviors Spiritual Thought Sooner or later, I believe that all of us experience times when the very fabric of our world
More informationME5286 Robotics Spring 2017 Quiz 2
Page 1 of 5 ME5286 Robotics Spring 2017 Quiz 2 Total Points: 30 You are responsible for following these instructions. Please take a minute and read them completely. 1. Put your name on this page, any other
More informationAnalyzing and improving the reproducibility of shots taken by a soccer robot
Analyzing and improving the reproducibility of shots taken by a soccer robot Overview Introduction Capacitor analysis Lever positioning analysis Validation Conclusion and recommendations Introduction Shots
More informationON A CONTINUED FRACTION IDENTITY FROM RAMANUJAN S NOTEBOOK
Asian Journal of Current Engineering and Maths 3: (04) 39-399. Contents lists available at www.innovativejournal.in ASIAN JOURNAL OF CURRENT ENGINEERING AND MATHS Journal homepage: http://www.innovativejournal.in/index.php/ajcem
More information