Lecture 13b: Latent Semantic Analysis
Transcription
1  Lecture 13b: Latent Semantic Analysis
CS540, 4/19/18
Material borrowed (with permission) from Vasileios Hatzivassiloglou & Evimaria Terzi. Mistakes are mine.

Announcement
- Project #2 test problems released
- Input files are complete states
- Output files are things that must be true in the final state
- All outputs are possible
- Input/output #1 (reordering): start with red on bottom, black on top; end with black on bottom, red on top. Lots of steps to achieve the 2nd subgoal.
- Input/output #2 (adding a subgoal): Goal: black on red. Twist: move black first, even though red is on bottom & must move.

Announcements (II)
- Input/output #3: stack randomly scattered blocks
- Lots of solutions (many ?s in the output file)
- Large search space

Announcements (III)
- Project #2 paper: due Tuesday. Four pages (no longer). Three parts:
- Introduction: describe your motivation and program. Why you designed it the way you did. How is it supposed to work. Predicted limitations and risks.
- Performance: How well does it work on the problems given? Report relevant quantitative metrics for your design. You may supplement with additional problems of your own design.
- Conclusion: Do your predictions match the results? If not, why not?

Announcements (IV)
- In-class presentation: also Tuesday. 5 minutes maximum (timed). PowerPoint or pdf slides. Email to me (draper@colostate.edu) the night before. Same 3-part structure as the paper.
- Reading assignment for next Thursday: Thomas Hofmann. Probabilistic Latent Semantic Indexing. ACM SIGIR Forum, 51(2), 2017.

Homonyms
- Crane: a bird with long legs and a long neck; a tool used to lift large objects; a motion that stretches (e.g. crane your neck)
- Date: a fruit of the palm tree; a romantic evening; a day of the month
- Foil: a material used to wrap things (sometimes made of aluminum); to ruin a plan or scheme; a dramatic foil
2  Resolving Homonyms
- Part-of-speech tagging: distinguishes between meanings with different grammatical roles, e.g. crane/bird (noun) vs crane/motion (verb). But all the homonyms on the previous slide have noun meanings.
- Direct modifiers (adjectives/adverbs/phrases): a "1 ton crane" is probably not a bird. (Anyone seen the movie Rampage?) Not always available; requires parsing and de-referencing to know which word is being modified.

Meaning as Association
- Context: different meanings (word senses) occur in different contexts.
- Crane/bird often occurs with fish, marsh, water, rare, ...
- Crane/tool often occurs with construction, building, lift, collapse, ...
- Use surrounding text to select word meanings.

Bag of Words
- Ignore syntax altogether; treat text as an unordered set of words.
- Two texts are similar if their words are similar.
- Problem: two texts may use different terms for the same thing. We've gone from homonyms to synonyms.
- How do we judge the similarity of groups of words?

Association as Information
- Calculating mutual information. Given a random variable X, entropy is
  H(X) = -Σ_x p(x) log p(x)
- Mutual information:
  I(X, Y) = Σ_x Σ_y p(x, y) log [ p(x, y) / (p(x) p(y)) ]
- Mutual information is the reduction in entropy from knowing another variable. Based on Kullback-Leibler distance D(p || q):
  I(X, Y) = H(X) - H(X | Y) = H(Y) - H(Y | X)

Specific Mutual Information (Church and Hanks, 1989; Smadja 1990)
- Only the 1-1 term:
  SI(X, Y) = log [ P(X, Y) / (P(X) P(Y)) ]

Association as Conditional Probabilities
- The Dice coefficient (Dice, 1945):
  D(X, Y) = 2 P(X, Y) / (P(X) + P(Y))
- Similar to the Jaccard coefficient.
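The two association measures above can be computed directly from co-occurrence probabilities. A minimal sketch, using made-up counts from a hypothetical 1000-sentence corpus (the words and counts are illustrative assumptions, not data from the lecture):

```python
import math

def pmi(p_xy, p_x, p_y):
    """Specific (pointwise) mutual information: log2 of the 1-1 term."""
    return math.log2(p_xy / (p_x * p_y))

def dice(p_xy, p_x, p_y):
    """Dice coefficient: 2*P(X,Y) / (P(X) + P(Y))."""
    return 2 * p_xy / (p_x + p_y)

# Hypothetical counts: "crane" in 20 of 1000 sentences, "construction"
# in 50, and the two together in 10.
n = 1000
p_crane, p_constr, p_both = 20 / n, 50 / n, 10 / n

print(round(pmi(p_both, p_crane, p_constr), 3))   # 3.322
print(round(dice(p_both, p_crane, p_constr), 3))  # 0.286
```

A positive PMI (here log2 of a tenfold ratio) says the pair co-occurs far more often than independence would predict, which is exactly the signal used to separate crane/tool from crane/bird.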
3  Datasets in the Form of Matrices
- We are given n objects and d features describing the objects. (Each object has d numeric values describing it.)
- Dataset: an n-by-d matrix A, where A_ij shows the importance of feature j for object i. Every row of A represents an object.
- Goal: 1. Understand the structure of the data, e.g., the underlying process generating the data. 2. Reduce the number of features representing the data.

Market-Basket Matrices
- n customers by d products (e.g., milk, bread, wine, etc.). A_ij = quantity of the j-th product purchased by the i-th customer.
- Find a subset of the products that characterize customer behavior.

Social-Network Matrices
- n users by d groups (e.g., BU group, opera, etc.). A_ij = participation of the i-th user in the j-th group.
- Find a subset of the groups that accurately clusters social-network users.

Document Matrices
- n documents by d terms (e.g., theorem, proof, etc.). A_ij = frequency of the j-th term in the i-th document.
- Find a subset of the terms that accurately clusters the documents.

The Singular Value Decomposition (SVD)
- Data matrices have n rows (one for each object) and d columns (one for each feature).
- Rows are vectors in a Euclidean space. Two objects are close if the angle between their corresponding vectors is small.

SVD: Example
- Input: 2-dimensional points.
- Output: the 1st (right) singular vector is the direction of maximal variance; the 2nd (right) singular vector is the direction of maximal variance after removing the projection of the data along the first singular vector.
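A minimal sketch of building the document matrix described above: an n-by-d count matrix where A[i, j] is the frequency of term j in document i. The three toy documents and the whitespace tokenization are illustrative assumptions:

```python
import numpy as np

docs = [
    "the crane lifted the steel beam",
    "a crane waded in the marsh water",
    "construction of the building used a crane",
]

# Build the vocabulary (the d features) and the n-by-d count matrix A,
# where A[i, j] is the frequency of term j in document i.
vocab = sorted({w for doc in docs for w in doc.split()})
index = {w: j for j, w in enumerate(vocab)}
A = np.zeros((len(docs), len(vocab)), dtype=int)
for i, doc in enumerate(docs):
    for w in doc.split():
        A[i, index[w]] += 1

print(A.shape)  # (3, 14): 3 documents, 14 distinct terms
```

Real corpora would need real tokenization, stopword handling, and perhaps tf-idf weighting, but the matrix shape and meaning are the same.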
4  Singular Values and the SVD Decomposition
- A = U Σ V^T, where A is n x d, U is n x l, Σ is l x l, and V^T is l x d.
- U (V) is an orthogonal matrix containing the left (right) singular vectors of A.
- Σ is a diagonal matrix containing the singular values of A: σ1 ≥ σ2 ≥ ... ≥ σl ≥ 0.
- σ1 measures how much of the data variance is explained by the first singular vector; σ2 measures how much of the data variance is explained by the second singular vector; and so on.
- Exact computation of the SVD takes O(min{mn^2, m^2 n}) time. The top k left/right singular vectors/values can be computed faster using Lanczos/Arnoldi methods.

SVD and Rank-k Approximations
- A_k = U_k Σ_k V_k^T, where U_k (V_k) is the orthogonal matrix containing the top k left (right) singular vectors of A, and Σ_k is the diagonal matrix containing the top k singular values of A.
- A_k is the best rank-k approximation of A.

SVD as an Optimization Problem
- Find the rank-k matrix A_k that minimizes
  || A - A_k ||_F^2 = Σ_ij (A_ij - (A_k)_ij)^2   (Frobenius norm)

Relating SVD to PCA
- SVD is a matrix decomposition algorithm. Applied to the raw matrix, we get X = U Σ V^T.
- When we apply SVD to the covariance matrix, we call it PCA: X X^T = U Σ^2 U^T.
- There is therefore a second form of PCA: X^T X = V Σ^2 V^T.
- Note: eigenvalues are singular values squared.
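The rank-k approximation and the SVD/PCA relationship can both be checked numerically. A minimal sketch on a random matrix (the matrix and k are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(6, 4))

# SVD: A = U @ diag(s) @ Vt, singular values sorted s[0] >= s[1] >= ...
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Best rank-k approximation A_k keeps the top k singular triplets.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The Frobenius error of A_k is exactly the sqrt of the discarded
# squared singular values (Eckart-Young theorem).
err = np.linalg.norm(A - A_k, "fro")
print(np.isclose(err, np.sqrt(np.sum(s[k:] ** 2))))  # True

# PCA link: eigenvalues of A^T A are the squared singular values of A.
eig = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
print(np.allclose(eig, s ** 2))  # True
```

This is the "eigenvalues are singular values squared" note from the slide, verified directly.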
5  The CX-Decomposition
- Find a matrix C that contains a subset of the columns of A, to minimize
  min_X || A - C X ||_F^2
- Given C, it is easy to find X from standard least squares. However, finding C is now hard!!!

Why CX-Decomposition?
- If A is an object-feature matrix, then selecting representative columns is equivalent to selecting representative features.
- This leads to easier interpretability; compare to eigenfeatures, which are linear combinations of all features.

Algorithms for the CX Decomposition
- The SVD-based algorithm: do SVD first, then map k columns of A to the left singular vectors.
- The greedy algorithm: greedily pick the k columns of A that minimize the error.
- The k-means-based algorithm: find k centers (by clustering the columns), then map the k centers to columns of A.

Discussion on the CX Decomposition
- The vectors in C are not orthogonal; they do not define a space.
- It maintains the sparsity of the data.

Latent Semantic Analysis
- How do we use SVD (or CX, or ICA, or ...) in NLP? Let's look at the text retrieval problem.
- Given a corpus of documents D and a new document A, order the documents in D by similarity to A.
- Not really. Usually IR is defined as returning the N most similar documents. But we will look at retrieval issues later. For the moment, defining an ordering will do.
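A minimal sketch of the greedy algorithm for the CX decomposition: at each step, try every remaining column, fit X by least squares given the candidate C, and keep the column that most reduces the Frobenius error. The function name and the toy matrix are illustrative assumptions; this brute-force version is O(k*d) least-squares solves, fine for a sketch but not for large d:

```python
import numpy as np

def greedy_cx(A, k):
    """Greedily pick k columns of A; X is the least-squares fit given C."""
    chosen = []
    for _ in range(k):
        best_j, best_err = None, np.inf
        for j in range(A.shape[1]):
            if j in chosen:
                continue
            C = A[:, chosen + [j]]
            # Given C, the optimal X is a standard least-squares solve.
            X, *_ = np.linalg.lstsq(C, A, rcond=None)
            err = np.linalg.norm(A - C @ X, "fro")
            if err < best_err:
                best_j, best_err = j, err
        chosen.append(best_j)
    C = A[:, chosen]
    X, *_ = np.linalg.lstsq(C, A, rcond=None)
    return chosen, C, X

rng = np.random.default_rng(1)
A = rng.normal(size=(8, 5))
cols, C, X = greedy_cx(A, 3)
print(len(cols), C.shape)  # 3 (8, 3)
```

Note that C here really is made of columns of A, which is the whole interpretability argument: each selected column is an actual feature, not an eigenfeature.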
6  LSA Step 1: Analyze the Corpus by SVD
- A = U Σ V^T, where A is a documents x terms matrix.

LSA Step 1 (cont.)
- The left singular vectors U map documents to concepts.
- Discard all but the first K columns of U (assuming they are ordered by the magnitude of the singular values). The first K columns will map documents onto the K major concepts in your corpus.
- U_K is a compacted version of the corpus. A is a documents x terms matrix (note: terms is the size of your lexicon!). U_K is documents x K: much smaller, and arguably more meaningful.
- Each row is a vector. Each term U_K[i, j] measures how much concept j occurs in document i.

LSA Step 2
- When given A (the query document), compute A U_K. This gives a vector a = A U_K; its j-th term measures how much concept u_j appears in A.
- Compute the cosine of the angle between a and every row of U_K. The closer to 1, the more similar the document is to A:
  cos(x, y) = (x · y) / (|x| |y|)

LSA Step 2 (cont.)
- Faster than you might think. Typically there is one corpus and many query documents A, given one at a time.
- Compute U_K once; normalize its rows to have magnitude 1.
- For every query document, compute A U_K and normalize it to have magnitude 1. Now compute U_K a using the normalized versions. The result is a vector of cosines of angles.

Disambiguating Homonyms
- We started with the homonym problem: how to select the correct word sense?
- A solution: use LSA. Collect every sentence the word occurs in from a corpus. This gives a set of small documents. Perform LSA, keeping K singular vectors (K should account for 85% of the energy). For every new use, find the closest word sense.
- Advantage: it works for new terms, if you add them to the corpus.
- Disadvantage: a word sense is just a column number. Linguists might still want to know what it means. If the CX decomposition is used, you can say "as used in document y".
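The retrieval pipeline above can be sketched end to end. One caveat: the sketch folds the query into concept space through the right singular vectors (the standard fold-in), which is the operation the slide's A·U_K notation is getting at once shapes are made concrete. The toy matrix and term meanings are illustrative assumptions:

```python
import numpy as np

# Toy documents x terms count matrix A; columns might be, say,
# [crane, lift, marsh, water].
A = np.array([
    [2, 1, 0, 0],   # doc 0: about cranes and lifting
    [0, 0, 2, 1],   # doc 1: about marshes and water
    [1, 2, 0, 0],   # doc 2: also about cranes and lifting
], dtype=float)

# Step 1: SVD, keep the top K concepts.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
K = 2
doc_concepts = U[:, :K] * s[:K]          # documents in concept space

def cosine(x, y):
    return x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

# Step 2: fold a new query document into concept space and compare
# it to every document by cosine similarity.
query = np.array([1.0, 1.0, 0.0, 0.0])   # mentions crane and lift
q = query @ Vt[:K].T

sims = [cosine(q, d) for d in doc_concepts]
print(sims[1] < min(sims[0], sims[2]))   # the marsh doc is least similar: True
```

As the slide notes, U_K and the normalization are computed once per corpus; each incoming query then costs only a projection and a batch of dot products.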
More informationComparison of Regression Lines
STATGRAPHICS Rev. 9/13/2013 Comparson of Regresson Lnes Summary... 1 Data Input... 3 Analyss Summary... 4 Plot of Ftted Model... 6 Condtonal Sums of Squares... 6 Analyss Optons... 7 Forecasts... 8 Confdence
More informationp 1 c 2 + p 2 c 2 + p 3 c p m c 2
Where to put a faclty? Gven locatons p 1,..., p m n R n of m houses, want to choose a locaton c n R n for the fre staton. Want c to be as close as possble to all the house. We know how to measure dstance
More informationOn a one-parameter family of Riordan arrays and the weight distribution of MDS codes
On a one-parameter famly of Roran arrays an the weght strbuton of MDS coes Paul Barry School of Scence Waterfor Insttute of Technology Irelan pbarry@wte Patrck Ftzpatrck Department of Mathematcs Unversty
More informationBuckingham s pi-theorem
TMA495 Mathematcal modellng 2004 Buckngham s p-theorem Harald Hanche-Olsen hanche@math.ntnu.no Theory Ths note s about physcal quanttes R,...,R n. We lke to measure them n a consstent system of unts, such
More informationComposite Hypotheses testing
Composte ypotheses testng In many hypothess testng problems there are many possble dstrbutons that can occur under each of the hypotheses. The output of the source s a set of parameters (ponts n a parameter
More informationU.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016
U.C. Berkeley CS94: Spectral Methods and Expanders Handout 8 Luca Trevsan February 7, 06 Lecture 8: Spectral Algorthms Wrap-up In whch we talk about even more generalzatons of Cheeger s nequaltes, and
More informationRELIABILITY ASSESSMENT
CHAPTER Rsk Analyss n Engneerng and Economcs RELIABILITY ASSESSMENT A. J. Clark School of Engneerng Department of Cvl and Envronmental Engneerng 4a CHAPMAN HALL/CRC Rsk Analyss for Engneerng Department
More informationLecture 2: Prelude to the big shrink
Lecture 2: Prelude to the bg shrnk Last tme A slght detour wth vsualzaton tools (hey, t was the frst day... why not start out wth somethng pretty to look at?) Then, we consdered a smple 120a-style regresson
More informationDifference Equations
Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1
More informationCS 229, Public Course Problem Set #3 Solutions: Learning Theory and Unsupervised Learning
CS9 Problem Set #3 Solutons CS 9, Publc Course Problem Set #3 Solutons: Learnng Theory and Unsupervsed Learnng. Unform convergence and Model Selecton In ths problem, we wll prove a bound on the error of
More information