Clustering (Bishop ch 9)
- Doris Wilkinson
1 Clustering (Bishop ch 9) Reference: Data Mining by Margaret Dunham (a slide source)
2 Clustering Clustering is unsupervised learning; there are no class labels. Want to find groups of similar instances. Often use a distance measure (such as Euclidean distance) for dis-similarity. Can use cluster membership/distances as additional (created) features.
3 Clustering Examples Segment a customer database based on similar buying patterns. Group houses in a town into neighborhoods based on features (location, sq ft, stories, lot size). Identify new plant species. Identify similar Web usage patterns.
4 Clustering Problem Given data D = {x_1, x_2, ..., x_n} of feature vectors and an integer value k, the Clustering Problem is to define a mapping where each x_i is assigned to one cluster K_j, 1 <= j <= k. A cluster K_j contains precisely those vectors mapped to it. Unlike the classification problem, the clusters are not known a priori.
5 Impact of Outliers on Clustering What are the best two clusters?
6 Types of Clustering Hierarchical: creates a tree of clusterings. Agglomerative (bottom up: merge closest) or Divisive (top down; less common). Partitional: one set of clusters is created, usually with the number of clusters supplied by the user. Clusters can be: overlapping (soft) / non-overlapping (hard).
7 Closest Clusters? Single link: smallest distance between points. Complete link: largest distance between points. Average link: average distance between points. Centroid: distance between centroids.
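The four linkage criteria above can be made concrete with a short sketch. This is illustrative code, not from the slides; it uses 1-D points with absolute difference as the distance, and all function names are my own.

```python
# Inter-cluster distances for 1-D points under Euclidean (absolute) distance.

def single_link(A, B):
    """Smallest distance between any pair of points across the two clusters."""
    return min(abs(a - b) for a in A for b in B)

def complete_link(A, B):
    """Largest distance between any pair of points across the two clusters."""
    return max(abs(a - b) for a in A for b in B)

def average_link(A, B):
    """Average of all pairwise distances across the two clusters."""
    return sum(abs(a - b) for a in A for b in B) / (len(A) * len(B))

def centroid_link(A, B):
    """Distance between the cluster centroids (means)."""
    return abs(sum(A) / len(A) - sum(B) / len(B))

A, B = [1, 2], [4, 8]
print(single_link(A, B), complete_link(A, B), average_link(A, B), centroid_link(A, B))
# single = 2, complete = 7, average = (3+7+2+6)/4 = 4.5, centroid = |1.5 - 6| = 4.5
```

Note that single and complete link depend only on extreme pairs, while average and centroid link summarize the whole clusters; the criteria can pick different "closest" pairs on the same data.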
8 Levels of Clustering
9 Dendrogram A dendrogram is a tree data structure which illustrates hierarchical clustering techniques. Each level shows the clusters for that level. Leaf: individual clusters. Root: one cluster. A cluster at level i is the union of its children clusters at level i+1.
10 Partitional Clustering Nonhierarchical: creates one level of clustering. Since only one set of clusters is output, the user normally has to input the desired number of clusters, k. Sometimes try different k and use the best one.
11 Partitional Algorithms K-Means. Gaussian mixtures (EM). Many, many others.
12 K-means clustering 1. Pick k starting means, µ_1, µ_2, ..., µ_k. Can use: randomly picked examples, perturbations of the sample mean, or points equally spaced along the principal component. 2. Repeat until convergence: (a) split the data into k sets, S_1, S_2, ..., S_k, where x is in S_i iff µ_i is the closest mean to x; (b) update each µ_i to the mean of S_i.
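The two-step loop above (this is Lloyd's algorithm) can be sketched in pure Python for 1-D data. The function name, the empty-cluster handling, and the convergence test are my own choices, not from the slides.

```python
# Minimal 1-D k-means: assign each point to its closest mean, then update.

def kmeans(xs, means, max_iter=100):
    """Run k-means from the given starting means; return (means, sets)."""
    sets = [[] for _ in means]
    for _ in range(max_iter):
        # Step (a): split data into k sets by closest mean.
        sets = [[] for _ in means]
        for x in xs:
            i = min(range(len(means)), key=lambda j: abs(x - means[j]))
            sets[i].append(x)
        # Step (b): update each mean (keep the old mean if its set is empty).
        new = [sum(s) / len(s) if s else m for s, m in zip(sets, means)]
        if new == means:          # converged: assignments can no longer change
            break
        means = new
    return means, sets

means, sets = kmeans([0.0, 1.0, 9.0, 10.0], [0.0, 10.0])
# Well-separated data: means settle at 0.5 and 9.5
```

Convergence here means the assignments, and hence the means, stop changing; the algorithm always terminates because each step can only decrease the within-cluster squared error.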
13 Lecture Notes for E. Alpaydın 2010, Introduction to Machine Learning 2e, The MIT Press (V1.0)
14 K-Means Example Given: {2,4,10,12,3,20,30,11,25}, k=2. Randomly assign means: m_1=3, m_2=4. K_1={2,3}, K_2={4,10,12,20,30,11,25}; m_1=2.5, m_2=16. K_1={2,3,4}, K_2={10,12,20,30,11,25}; m_1=3, m_2=18. K_1={2,3,4,10}, K_2={12,20,30,11,25}; m_1=4.75, m_2=19.6. K_1={2,3,4,10,11,12}, K_2={20,30,25}; m_1=7, m_2=25. Stop, as the clusters with these means stay the same.
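The worked example above can be checked mechanically. A small sketch (variable names mine) that repeats assign-then-update until the means stop changing:

```python
# Reproduce the slide's k-means run: start from means 3 and 4, iterate.
xs = [2, 4, 10, 12, 3, 20, 30, 11, 25]
means = [3.0, 4.0]                      # m_1, m_2 as on the slide
while True:
    clusters = [[x for x in xs
                 if min(range(2), key=lambda j: abs(x - means[j])) == i]
                for i in range(2)]
    new = [sum(c) / len(c) for c in clusters]
    if new == means:                    # same means -> same clusters: stop
        break
    means = new
print(sorted(clusters[0]), sorted(clusters[1]), means)
# Final clusters {2,3,4,10,11,12} and {20,25,30}, with means 7.0 and 25.0
```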
15 [Figure: the original data and its clusterings for k=2, k=3, k=10]
16 Tabular view of k-means [Table: each instance x^t gets a hard 0/1 assignment to exactly one of the means µ_1 ... µ_4]
17 Soft k-means clustering [Table: each instance x^t gets fractional assignment weights spread across the means µ_1 ... µ_4]
18 From Soft Clustering to EM Use a weighted mean based on the soft-clustering weights. Soft cluster weights are probabilities: P(cluster|x). Uses Bayes rule: P(cluster|x) is proportional to P(x|cluster) P(cluster). For each x, the true cluster for x is a latent (unobserved) variable.
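The Bayes-rule weights above can be sketched for 1-D Gaussian clusters with known priors, means, and standard deviations. The function name and arguments below are illustrative choices of mine, not from the slides.

```python
# P(cluster|x) ∝ P(x|cluster) P(cluster), normalized over clusters.
import math

def responsibilities(x, priors, mus, sigmas):
    """Return P(cluster|x) for each cluster, normalized to sum to 1."""
    def gauss(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    joint = [p * gauss(x, mu, s) for p, mu, s in zip(priors, mus, sigmas)]
    z = sum(joint)                      # normalizer: P(x)
    return [j / z for j in joint]

w = responsibilities(5.0, priors=[0.5, 0.5], mus=[0.0, 10.0], sigmas=[2.0, 2.0])
# Symmetric case: x sits midway between the two means, so the weights are [0.5, 0.5]
```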
19 Soft Clustering to EM 2 Assume parametric forms for P(cluster) (multinomial) and P(x|cluster) (Gaussian). Iteratively: 1. Smoothly estimate the cluster memberships (latent variables) based on the data and the old parameters. 2. Update the parameters to maximize the likelihood of the data, assuming the new estimates are the truth. This is the mixture-of-Gaussians EM algorithm; see http://citeseer.ist.psu.edu/bilmes98gentle.html
20 [Figure: generative view of soft clustering; priors P(c), component densities P(x|c) with parameters (µ_1,σ_1) ... (µ_4,σ_4); each weight for x^t is P(x|c)P(c), normalized]
21 Expectation-Maximization (EM) Log likelihood with a mixture model:
$\mathcal{L}(\Phi|X) = \log \prod_t p(x^t|\Phi) = \sum_t \log \sum_{i=1}^{k} p(x^t|G_i)\, P(G_i)$
Assume hidden variables Z which, when known, make optimization much simpler. Complete likelihood, $\mathcal{L}_c(\Phi|X,Z)$, in terms of x and z. Incomplete likelihood, $\mathcal{L}(\Phi|X)$, in terms of x.
22 E- and M-steps Iterate the two steps:
1. E-step: estimate Z given X and the current $\Phi$.
2. M-step: find the new $\Phi$ given Z, X, and the old $\Phi$.
E-step: $Q(\Phi|\Phi^l) = E[\mathcal{L}_c(\Phi|X,Z) \mid X, \Phi^l]$
M-step: $\Phi^{l+1} = \arg\max_\Phi Q(\Phi|\Phi^l)$
$\mathcal{L}(\Phi^{l+1}|X) \geq \mathcal{L}(\Phi^l|X)$: an increase in Q increases the incomplete likelihood.
23 EM as likelihood ascent [Figure: the likelihood as a function of Φ; Q (using the estimated memberships) is a bound maximized to move from the current Φ to the new Φ. Φ: parameters for the mixture and all G_i]
24 EM in Gaussian Mixtures Hidden: $z_i^t = 1$ if $x^t$ belongs to $G_i$, 0 otherwise (the labels $r_i^t$ of supervised learning); assume $p(x|G_i) \sim \mathcal{N}(\mu_i, \Sigma_i)$.
E-step: $E[z_i^t \mid X, \Phi^l] = \frac{p(x^t|G_i,\Phi^l)\, P(G_i)}{\sum_j p(x^t|G_j,\Phi^l)\, P(G_j)} = P(G_i|x^t,\Phi^l) \equiv h_i^t$
M-step: $P(G_i) = \frac{\sum_t h_i^t}{N}$, $m_i^{l+1} = \frac{\sum_t h_i^t x^t}{\sum_t h_i^t}$, $S_i^{l+1} = \frac{\sum_t h_i^t (x^t - m_i^{l+1})(x^t - m_i^{l+1})^T}{\sum_t h_i^t}$
Q uses the estimated z's (the h's) in place of the unknown labels.
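The E- and M-steps above can be sketched for a 1-D mixture (scalar variances). This is an illustrative pure-Python version, not the textbook's code; the names are mine, and a variance floor is included, anticipating the degeneracy problem the slides discuss.

```python
# 1-D Gaussian-mixture EM: responsibilities (E-step), weighted updates (M-step).
import math

def em_gmm_1d(xs, mus, sigmas, priors, n_iter=50, min_sigma=1e-3):
    for _ in range(n_iter):
        # E-step: h[t][i] = P(G_i | x^t) by Bayes rule.
        # The 1/sqrt(2*pi) constant cancels in the normalization, so it is omitted.
        h = []
        for x in xs:
            joint = [p * math.exp(-0.5 * ((x - m) / s) ** 2) / s
                     for p, m, s in zip(priors, mus, sigmas)]
            z = sum(joint)
            h.append([j / z for j in joint])
        # M-step: responsibility-weighted priors, means, and variances.
        for i in range(len(mus)):
            w = [h[t][i] for t in range(len(xs))]
            sw = sum(w)
            priors[i] = sw / len(xs)
            mus[i] = sum(wt * x for wt, x in zip(w, xs)) / sw
            var = sum(wt * (x - mus[i]) ** 2 for wt, x in zip(w, xs)) / sw
            sigmas[i] = max(math.sqrt(var), min_sigma)  # floor avoids degeneracy
    return mus, sigmas, priors

mus, sigmas, priors = em_gmm_1d(
    [0.0, 0.5, 1.0, 9.0, 9.5, 10.0], [1.0, 8.0], [1.0, 1.0], [0.5, 0.5])
# Two well-separated blobs: the means settle near 0.5 and 9.5
```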
25 [Figure: two-component Gaussian mixture; at the decision boundary, P(G_1|x) = h_1 = 0.5]
26 Problems with EM Local minima: run several times and take the best result, or use a good initialization (perhaps k-means). Degenerate Gaussians: as σ goes to zero, the likelihood goes to infinity; fix a lower bound on σ. Lots of parameters to learn: use spherical Gaussians or shared covariance matrices (or even fixed distributions).
27 EM Summary Iterative method for maximizing likelihood. General method: not just for Gaussian mixtures, but also HMMs, Bayes nets, etc. Generally works well, but can have local minima and degenerate situations. Gets both a clustering and a distribution (mixture of Gaussians); the distributions can be used for Bayesian learning (e.g., learn P(x|y) using a Gaussian mixture model).
29 Expectation-Maximization (EM) Log likelihood with a mixture model:
$\mathcal{L}(\Phi|X) = \log \prod_t p(x^t|\Phi) = \sum_t \log \sum_{i=1}^{k} p(x^t|G_i)\, P(G_i)$
With each $G_i$ a generative model, the log of a sum is tough. Assume hidden variables Z which, when known, make optimization much simpler (they tell which $G_i$). Complete likelihood, $\mathcal{L}_c(\Phi|X,Z)$, in terms of x and z. Incomplete likelihood, $\mathcal{L}(\Phi|X)$, in terms of x.
30 E- and M-steps Iterate the following two steps:
E-step: estimate the distribution of Z given X and the current $\Phi$.
M-step: find the new $\Phi$ given Z, X, and the old $\Phi$.
E-step: $Q(\Phi|\Phi^l) = E[\mathcal{L}_c(\Phi|X,Z) \mid X, \Phi^l]$
M-step: $\Phi^{l+1} = \arg\max_\Phi Q(\Phi|\Phi^l)$
An increase in Q increases the incomplete likelihood: $\mathcal{L}(\Phi^{l+1}|X) \geq \mathcal{L}(\Phi^l|X)$.
31 EM in Gaussian Mixtures Hidden: $z_i^t = 1$ if $x^t$ belongs to $G_i$, 0 otherwise; assume $p(x|G_i) \sim \mathcal{N}(\mu_i, \Sigma_i)$.
E-step: $E[z_i^t \mid X, \Phi^l] = \frac{p(x^t|G_i,\Phi^l)\, P(G_i)}{\sum_j p(x^t|G_j,\Phi^l)\, P(G_j)} = P(G_i|x^t,\Phi^l) \equiv h_i^t$
M-step: $P(G_i) = \frac{\sum_t h_i^t}{N}$, $m_i^{l+1} = \frac{\sum_t h_i^t x^t}{\sum_t h_i^t}$, $S_i^{l+1} = \frac{\sum_t h_i^t (x^t - m_i^{l+1})(x^t - m_i^{l+1})^T}{\sum_t h_i^t}$
Use the estimated labels in place of the unknown labels.
32 EM in Gaussian Mixtures Hidden: $z_i^t = 1$ if $x^t$ belongs to $G_i$, 0 otherwise; assume $p(x|G_i) \sim \mathcal{N}(\mu_i, \Sigma_i)$.
E-step: $Q(\Phi|\Phi^l) = E[\mathcal{L}_c(\Phi|X,Z) \mid X, \Phi^l]$
M-step: $\Phi^{l+1} = \arg\max_\Phi Q(\Phi|\Phi^l)$
$\mathcal{L}_c(\Phi|X,Z) = \log p(X,Z|\Phi)$
$Q(\Phi^{new}|\Phi^{old}) = \sum_Z \log p(X,Z|\Phi^{new})\, P(Z|X,\Phi^{old})$
33 After Clustering Dimensionality reduction methods find correlations between features and group features. Clustering methods find similarities between instances and group instances. Allows knowledge extraction through the number of clusters, the prior probabilities, and the cluster parameters, i.e., center, range of features.
34 Clustering as Preprocessing Estimated group labels h_j (soft) or b_j (hard) may be seen as the dimensions of a new k-dimensional space, where we can then learn our discriminant or regressor. Local representation (only one b_j is 1, all others are 0; only a few h_j are nonzero) vs. distributed representation (after PCA; all z_j are nonzero).
35 Mixture of Mixtures In classification, the input comes from a mixture of classes (supervised). If each class is also a mixture, e.g., of Gaussians (unsupervised), we have a mixture of mixtures:
$p(x|C_i) = \sum_{j=1}^{k_i} p(x|G_{ij})\, P(G_{ij})$
$p(x) = \sum_{i=1}^{K} p(x|C_i)\, P(C_i)$
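The two formulas can be sketched directly: each class density is itself a mixture, and the overall density mixes the classes. The helper names and the example mixture below are illustrative, not from the slides.

```python
# Mixture of mixtures in 1-D: per-class Gaussian mixtures, mixed over classes.
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def p_x_given_class(x, components):
    """p(x|C_i) = sum_j p(x|G_ij) P(G_ij); components = [(P(G_ij), mu, sigma), ...]."""
    return sum(w * gauss(x, mu, s) for w, mu, s in components)

def p_x(x, classes):
    """p(x) = sum_i p(x|C_i) P(C_i); classes = [(P(C_i), components_i), ...]."""
    return sum(pc * p_x_given_class(x, comps) for pc, comps in classes)

classes = [
    (0.5, [(0.5, -2.0, 1.0), (0.5, 2.0, 1.0)]),  # class 1: a two-component mixture
    (0.5, [(1.0, 10.0, 1.0)]),                    # class 2: a single Gaussian
]
total = p_x(0.0, classes)
```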
36 Clustering vs. Classification Less prior knowledge: the number of clusters (may be assumed); the meaning of the clusters is not assumed. Unsupervised learning: no labels.
37 Cluster Parameters t_im is the i-th feature vector in cluster m.
38 Clustering Issues Outlier handling. Dynamic data. Interpreting results. Evaluating results. Number of clusters. Data to be used. Scalability.
39 Hierarchical Clustering Clusters are created in levels, actually creating sets of clusters at each level. Agglomerative: initially each item is in its own cluster; iteratively, clusters are merged together (bottom up). Divisive: initially all items are in one cluster; large clusters are successively divided (top down).
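A minimal sketch of the agglomerative (bottom-up) procedure with single-link distance on 1-D points; the names and the tie-breaking behavior (first closest pair found wins) are my own choices, not from the slides.

```python
# Bottom-up clustering: start with singletons, repeatedly merge the closest pair.

def single_link(A, B):
    return min(abs(a - b) for a in A for b in B)

def agglomerative(xs, k):
    """Merge the closest pair of clusters until only k clusters remain."""
    clusters = [[x] for x in xs]
    while len(clusters) > k:
        # Find the closest pair of clusters under single link.
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: single_link(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)   # j > i, so index i stays valid
    return clusters

print(agglomerative([1, 2, 9, 10, 30], 2))
# The outlier 30 merges last, leaving {1, 2, 9, 10} and {30}
```

Running the loop all the way down to k = 1 and recording each merge yields exactly the dendrogram described earlier.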
More informationStructural Optimization Using Metamodels
Srucural Opmzaon Usng Meamodels 30 Mar. 007 Dep. o Mechancal Engneerng Dong-A Unvers Korea Kwon-Hee Lee Conens. Numercal Opmzaon. Opmzaon Usng Meamodels Impac beam desgn WB Door desgn 3. Robus Opmzaon
More informationEM and Structure Learning
EM and Structure Learnng Le Song Machne Learnng II: Advanced Topcs CSE 8803ML, Sprng 2012 Partally observed graphcal models Mxture Models N(μ 1, Σ 1 ) Z X N N(μ 2, Σ 2 ) 2 Gaussan mxture model Consder
More informationGait Tracking and Recognition Using Person-Dependent Dynamic Shape Model
Ga Trackng and Recognon Usng Person-Dependen Dynamc Shape Model Chan-Su Lee and Ahmed Elgammal Rugers Unversy Pscaaway, NJ 8854 USA Compuer Scence Deparmen {chansu,elgammal}@cs.rugers.edu Absrac Characerscs
More informationLi An-Ping. Beijing , P.R.China
A New Type of Cpher: DICING_csb L An-Png Bejng 100085, P.R.Chna apl0001@sna.com Absrac: In hs paper, we wll propose a new ype of cpher named DICING_csb, whch s derved from our prevous sream cpher DICING.
More informationMechanics Physics 151
Mechancs Physcs 5 Lecure 9 Hamlonan Equaons of Moon (Chaper 8) Wha We Dd Las Tme Consruced Hamlonan formalsm H ( q, p, ) = q p L( q, q, ) H p = q H q = p H = L Equvalen o Lagrangan formalsm Smpler, bu
More informationA Deterministic Algorithm for Summarizing Asynchronous Streams over a Sliding Window
A Deermnsc Algorhm for Summarzng Asynchronous Sreams over a Sldng ndow Cosas Busch Rensselaer Polyechnc Insue Srkana Trhapura Iowa Sae Unversy Oulne of Talk Inroducon Algorhm Analyss Tme C Daa sream: 3
More information2/20/2013. EE 101 Midterm 2 Review
//3 EE Mderm eew //3 Volage-mplfer Model The npu ressance s he equalen ressance see when lookng no he npu ermnals of he amplfer. o s he oupu ressance. I causes he oupu olage o decrease as he load ressance
More informationChapter Lagrangian Interpolation
Chaper 5.4 agrangan Inerpolaon Afer readng hs chaper you should be able o:. dere agrangan mehod of nerpolaon. sole problems usng agrangan mehod of nerpolaon and. use agrangan nerpolans o fnd deraes and
More informationNew M-Estimator Objective Function. in Simultaneous Equations Model. (A Comparative Study)
Inernaonal Mahemacal Forum, Vol. 8, 3, no., 7 - HIKARI Ld, www.m-hkar.com hp://dx.do.org/.988/mf.3.3488 New M-Esmaor Objecve Funcon n Smulaneous Equaons Model (A Comparave Sudy) Ahmed H. Youssef Professor
More informationMechanics Physics 151
Mechancs Physcs 5 Lecure 9 Hamlonan Equaons of Moon (Chaper 8) Wha We Dd Las Tme Consruced Hamlonan formalsm Hqp (,,) = qp Lqq (,,) H p = q H q = p H L = Equvalen o Lagrangan formalsm Smpler, bu wce as
More information