Nonparametric Bayesian Matrix Factorization for Assortative Networks
1 Nonparametric Bayesian Matrix Factorization for Assortative Networks

Mingyuan Zhou
IROM Department, McCombs School of Business
Department of Statistics and Data Sciences
The University of Texas at Austin

23rd European Signal Processing Conference (EUSIPCO'15), Nice, France, September 4, 2015
2 Table of Contents

- Introduction
- Gamma process edge partition model
- Example results
- Improve EPM to model dissortativity
- Conclusions
3 Introduction: Network community detection and link prediction

We focus on unweighted, undirected relational networks, which can also be represented as binary symmetric adjacency matrices.

Non-probabilistic community detection algorithms (see Fortunato, 2010 for a comprehensive review):
- Examples: modularity maximization (Newman and Girvan, 2004); clique percolation (Palla et al., 2005)
- Restrictions: usually cannot be used to generate networks or predict missing edges (links); often need to tune the number of communities
4 Introduction: Network community detection and link prediction

Generative network models:
- State model assumptions clearly in a hierarchical model
- Generate random networks
- Detect latent communities
- Detect community-community interactions
- Predict missing edges (links)
- Automatically infer the number of communities with nonparametric Bayesian priors
5 Introduction: Assortative and dissortative relational networks

Assortativity (also known as homophily): a subset of nodes that are densely connected to each other but sparsely connected to the others are often considered to belong to the same community. Example: in a social network, a community may consist of a group of closely related friends.
6 Introduction: Assortative and dissortative relational networks

Dissortativity (also known as stochastic equivalence): a subset of nodes that are sparsely connected to each other but densely connected to another subset of nodes are often considered to belong to the same community. Example: in a predator-prey network, a community may consist of a group of animals that play similar roles in the ecosystem but do not necessarily prey on each other.
7 Introduction: Probabilistic models for network analysis

Latent class models:
- Stochastic blockmodel (Holland et al., 1983; Nowicki and Snijders, 2001)
- Infinite relational model (Kemp et al., 2006)
- Mixed-membership stochastic blockmodel (Airoldi et al., 2008)

Latent factor models:
- Eigenmodel (Hoff, 2008)
- Infinite latent feature relational model (Miller et al., 2009; Mørup et al., 2011)
- Community-affiliation graph model (Yang and Leskovec, 2012, 2014)

These models support detection of disjoint or overlapping communities, interpretation of latent representations, and prediction of missing edges.
8 Gamma process edge partition model

Detects overlapping communities and predicts missing edges.

As a latent factor model:
- Connects each binary edge to a latent count via the Bernoulli-Poisson link
- Factorizes the latent count matrix

As a latent class model:
- Explicitly partitions each observed edge into multiple latent communities
- Implicitly assigns a node to multiple communities based on how its edges are partitioned (overlapping communities)

Designed to analyze assortative networks; can be generalized to capture dissortativity by modeling community-community interactions.
9 Gamma process edge partition model: hierarchical model

$$b_{ij} = \mathbf{1}(m_{ij} \ge 1), \quad m_{ij} = \sum_{k=1}^K m_{ijk}, \quad m_{ijk} \sim \mathrm{Po}(r_k \phi_{ik} \phi_{jk}),$$
$$\phi_{ik} \sim \mathrm{Gam}(a_i, 1/c_i), \quad a_i \sim \mathrm{Gam}(e_0, 1/f_0),$$
$$r_k \sim \mathrm{Gam}(\gamma_0/K, 1/c_0), \quad \gamma_0 \sim \mathrm{Gam}(e_1, 1/f_1).$$

The Bernoulli-Poisson link:
$$b_{ij} \sim \mathrm{Bernoulli}\left[1 - \exp\left(-\sum_{k=1}^K r_k \phi_{ik} \phi_{jk}\right)\right].$$
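As a concrete illustration, the finite-K truncation of this generative process can be sketched in Python; the hyperparameter values below are arbitrary choices for the sketch, not settings from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative settings (not from the talk): N nodes, truncation K.
N, K = 50, 10
e0 = f0 = e1 = f1 = 1.0   # hyperparameters of the gamma priors
c0 = ci = 1.0

gamma0 = rng.gamma(e1, 1.0 / f1)
r = rng.gamma(gamma0 / K, 1.0 / c0, size=K)             # community rates r_k
a = rng.gamma(e0, 1.0 / f0, size=N)                     # per-node shapes a_i
phi = rng.gamma(np.tile(a[:, None], (1, K)), 1.0 / ci)  # affiliations phi_ik

# Bernoulli-Poisson link: b_ij = 1 iff the latent Poisson count is >= 1,
# so P(b_ij = 1) = 1 - exp(-sum_k r_k phi_ik phi_jk).
lam = (phi * r) @ phi.T
p_link = 1.0 - np.exp(-lam)
upper = np.triu(rng.random((N, N)) < p_link, k=1)       # sample each pair once
B = (upper | upper.T).astype(int)                       # symmetric, zero diagonal
```

Running this yields a symmetric binary adjacency matrix `B` drawn from the truncated model.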
10 The Bernoulli-Poisson link

Thresholding a count variable to obtain a binary variable:
$$b = \mathbf{1}(m \ge 1), \quad m \sim \mathrm{Po}(\lambda). \tag{1}$$

Marginal likelihood of the Bernoulli-Poisson link:
$$b \sim \mathrm{Ber}\left(1 - e^{-\lambda}\right).$$

The conditional posterior of the latent count $m$ follows a truncated Poisson distribution:
$$(m \mid b, \lambda) \sim b \cdot \mathrm{Po}_+(\lambda).$$

Use rejection sampling to sample from the truncated Poisson distribution. The link has conceptual and computational advantages over the probit and logistic links.
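A minimal sketch of exact $\mathrm{Po}_+(\lambda)$ sampling, assuming a simple hybrid scheme (the function name is mine): rejection when the rate is large, and term-by-term inversion of the zero-truncated CDF when it is small, since plain rejection would almost always reject there:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_po_plus(lam):
    """Draw m ~ Po_+(lam): Poisson(lam) conditioned on m >= 1."""
    if lam > 1.0:
        # Rejection: acceptance probability 1 - exp(-lam) > 0.63 here.
        while True:
            m = int(rng.poisson(lam))
            if m >= 1:
                return m
    # Small lam: invert the zero-truncated CDF term by term.
    u = rng.random() * (1.0 - np.exp(-lam))
    m = 1
    term = lam * np.exp(-lam)   # P(Po(lam) = 1)
    cdf = term
    while cdf < u:
        m += 1
        term *= lam / m
        cdf += term
    return m
```

Every draw is at least 1 by construction, and for $\lambda \to 0$ the distribution concentrates on $m = 1$.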
11 Overlapping community structures

Edge partition model (EPM) under data augmentation:
$$m_{ij} = \sum_k m_{ijk}, \quad m_{ijk} \sim \mathrm{Po}(r_k \phi_{ik} \phi_{jk}).$$

$m_{ijk}$ represents how often nodes $i$ and $j$ interact due to their affiliations with community $k$. The quantity $r_k \phi_{ik} \sum_{j \ne i} \phi_{jk}$ measures how strongly node $i$ is affiliated with community $k$, and the latent count
$$m_{i \cdot k} := \sum_{j=i+1}^N m_{ijk} + \sum_{j=1}^{i-1} m_{jik} \tag{2}$$
represents how often node $i$ is connected to the other nodes due to its affiliation with community $k$.

Assign node $i$ to multiple communities in $\{k : m_{i \cdot k} \ge 1\}$, or (hard) assign it to a single community using either $\operatorname{argmax}_k \big(r_k \phi_{ik} \sum_{j \ne i} \phi_{jk}\big)$ or $\operatorname{argmax}_k (m_{i \cdot k})$.
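Given the partitioned counts $m_{ijk}$ (filled below with arbitrary synthetic values rather than sampler output), the per-node counts $m_{i \cdot k}$ of eq. (2) and the two assignment rules can be computed as:

```python
import numpy as np

rng = np.random.default_rng(2)

N, K = 6, 3
# Synthetic partitioned edge counts m[i, j, k]; in a real run these come from
# the Gibbs sampler's multinomial partition step, and only entries with
# i < j are used (each undirected edge is counted once).
m = rng.poisson(0.5, size=(N, N, K))
mask = np.triu(np.ones((N, N), dtype=bool), k=1)
m = m * mask[:, :, None]

# m_{i.k} = sum_{j>i} m_ijk + sum_{j<i} m_jik
m_ik = m.sum(axis=1) + m.sum(axis=0)

hard = m_ik.argmax(axis=1)                               # one community per node
soft = [np.flatnonzero(m_ik[i] >= 1) for i in range(N)]  # overlapping memberships
```

`soft` realizes the overlapping assignment $\{k : m_{i \cdot k} \ge 1\}$, while `hard` is the single-community rule.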
12 Related model: community-affiliation graph model (AGM)

A restricted version of the gamma process EPM,
$$b_{ij} \sim \mathrm{Ber}\left[1 - e^{-\epsilon} \prod_k \exp(-r_k \phi_{ik} \phi_{jk})\right],$$
where $\epsilon \in \mathbb{R}_+$ and $\phi_{ik} \in \{0, 1\}$, can be considered a nonparametric Bayesian generalization of the community-affiliation graph model (AGM) of Yang and Leskovec (2012, 2014).

The AGM papers argue that all previous community detection methods, including clique percolation and the MMSB, would fail to detect communities with dense overlaps, due to a hidden assumption that a community's overlapping parts are less densely connected than its non-overlapping ones. The EPM does not make such a restrictive assumption; and beyond the AGM, it does not restrict $\phi_{ik}$ to be binary.
13 Data augmentation and marginalization

Using the Poisson additive property, we have
$$m_{i \cdot k} \sim \mathrm{Po}\Big(r_k \phi_{ik} \sum_{j \ne i} \phi_{jk}\Big), \quad m_{\cdot \cdot k} \sim \mathrm{Po}\Big(\frac{r_k}{2} \sum_i \sum_{j \ne i} \phi_{ik} \phi_{jk}\Big).$$

Marginalizing out $\phi_{ik}$ leads to
$$m_{i \cdot k} \sim \mathrm{NB}(a_i, p_{ik}), \quad p_{ik} := \frac{r_k \sum_{j \ne i} \phi_{jk}}{c_i + r_k \sum_{j \ne i} \phi_{jk}}.$$

Marginalizing out $r_k$ leads to
$$m_{\cdot \cdot k} \sim \mathrm{NB}(\gamma_0/K, \tilde{p}_k), \quad \tilde{p}_k := \frac{\sum_i \sum_{j \ne i} \phi_{ik} \phi_{jk}}{2 c_0 + \sum_i \sum_{j \ne i} \phi_{ik} \phi_{jk}}.$$

Using these equations, we can develop closed-form Gibbs sampling update equations for all model parameters.
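The $\phi$-marginalization can be checked numerically in a scalar toy version: with $\phi \sim \mathrm{Gam}(a, 1/c)$ and $m \mid \phi \sim \mathrm{Po}(r s \phi)$ for a fixed $s$ (standing in for $\sum_{j \ne i} \phi_{jk}$), the marginal of $m$ is $\mathrm{NB}(a, p)$ with $p = rs/(c + rs)$, whose mean is $a p / (1 - p) = a r s / c$. All constants below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)

a, c, r, s = 2.0, 1.5, 0.8, 3.0   # arbitrary toy constants
n = 200_000

phi = rng.gamma(a, 1.0 / c, size=n)   # phi ~ Gam(a, 1/c)
m = rng.poisson(r * s * phi)          # m | phi ~ Po(r * s * phi)

p = r * s / (c + r * s)
nb_mean = a * p / (1.0 - p)           # = a * r * s / c = 3.2
print(m.mean(), nb_mean)              # the two should be close
```

The empirical mean of the gamma-mixed Poisson draws matches the negative binomial mean, as the marginalization predicts.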
14 Gibbs sampling

- Sample $m_{ij}$: $(m_{ij} \mid -) \sim b_{ij} \cdot \mathrm{Po}_+\big(\sum_{k=1}^K r_k \phi_{ik} \phi_{jk}\big)$.
- Sample $m_{ijk}$: $(\{m_{ijk}\}_{k=1:K} \mid -) \sim \mathrm{Mult}\Big(m_{ij};\ \Big\{\frac{r_k \phi_{ik} \phi_{jk}}{\sum_{k} r_k \phi_{ik} \phi_{jk}}\Big\}_{k=1:K}\Big)$.
- Sample $a_i$: draw $(l_{ik} \mid -) \sim \sum_{t=1}^{m_{i \cdot k}} \mathrm{Ber}\big(\frac{a_i}{a_i + t - 1}\big)$, then $(a_i \mid -) \sim \mathrm{Gam}\Big(e_0 + \sum_k l_{ik},\ \frac{1}{f_0 - \sum_k \ln(1 - p_{ik})}\Big)$.
- Sample $\phi_{ik}$: $(\phi_{ik} \mid -) \sim \mathrm{Gam}\Big(a_i + m_{i \cdot k},\ \frac{1}{c_i + r_k \sum_{j \ne i} \phi_{jk}}\Big)$.
- Sample $\gamma_0$, $c_i$ and $c_0$ in closed form as well.
- Sample $r_k$: $(r_k \mid -) \sim \mathrm{Gam}\Big(\frac{\gamma_0}{K} + m_{\cdot \cdot k},\ \frac{1}{c_0 + \frac{1}{2} \sum_i \sum_{j \ne i} \phi_{ik} \phi_{jk}}\Big)$.
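Putting the updates together, one sweep of a truncated (finite-K) sampler can be sketched as follows. This is a simplified sketch under stated assumptions: it uses a shared shape `a` and holds `a`, `c_i`, `c0`, and `gamma0` fixed, so the CRT-based $a_i$ update and the hyperparameter updates are omitted:

```python
import numpy as np

rng = np.random.default_rng(4)

def po_plus(lam):
    """Po(lam) conditioned on >= 1: rejection for large lam, CDF inversion for small."""
    if lam > 1.0:
        while True:
            m = int(rng.poisson(lam))
            if m >= 1:
                return m
    u = rng.random() * (1.0 - np.exp(-lam))
    m, term = 1, lam * np.exp(-lam)
    cdf = term
    while cdf < u:
        m += 1
        term *= lam / m
        cdf += term
    return m

def gibbs_sweep(B, phi, r, a=1.0, c_i=1.0, c0=1.0, gamma0=1.0):
    """One Gibbs sweep of a finite-K EPM sketch (hyperparameters held fixed)."""
    N, K = phi.shape
    m_ik = np.zeros((N, K))
    m_k = np.zeros(K)
    # Latent counts are nonzero only on observed edges (b_ij = 1).
    for i, j in zip(*np.nonzero(np.triu(B, k=1))):
        lam = r * phi[i] * phi[j]
        m_ij = po_plus(lam.sum())                       # (m_ij | -) ~ Po_+(sum_k lam_k)
        parts = rng.multinomial(m_ij, lam / lam.sum())  # partition over communities
        m_ik[i] += parts
        m_ik[j] += parts
        m_k += parts
    # (phi_ik | -) ~ Gam(a + m_i.k, 1 / (c_i + r_k * sum_{j != i} phi_jk))
    for i in range(N):
        col_sums = phi.sum(axis=0)
        phi[i] = rng.gamma(a + m_ik[i], 1.0 / (c_i + r * (col_sums - phi[i])))
    # (r_k | -) ~ Gam(gamma0/K + m..k, 1 / (c0 + 0.5 * sum_i sum_{j!=i} phi_ik phi_jk))
    col_sums = phi.sum(axis=0)
    quad = 0.5 * (col_sums ** 2 - (phi ** 2).sum(axis=0))
    r = rng.gamma(gamma0 / K + m_k, 1.0 / (c0 + quad))
    return phi, r
```

In use, one would initialize `phi` and `r` from their priors and call `gibbs_sweep` repeatedly on the adjacency matrix `B`, collecting posterior samples after burn-in.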
15 Example results: synthetic assortative network

Four communities with dense intra-community connections; the 2nd community overlaps with both the 1st and 3rd.

Figure: Comparison of four algorithms' abilities to recover the ground-truth link probabilities using 80% of the pairs of nodes randomly selected from a synthetic relational network. Panels: (a) ground truth, (b) adjacency matrix, (c) IRM, (d) Eigenmodel, (e) AGM, (f) GP-EPM. The number of features for the Eigenmodel is set as K = 4.
16 Example results

- The infinite relational model (IRM) accurately captures the community structures but produces cartoonish blocks.
- The Eigenmodel somewhat overfits the data.
- The AGM produces some undesired artifacts.
- The gamma process EPM (GP-EPM) provides a reconstruction that looks most similar to the ground truth.

(Panels: (a) ground truth, (b) adjacency matrix, (c) IRM, (d) Eigenmodel, (e) AGM, (f) GP-EPM.)
17 Example results

Both the GP-EPM and the Eigenmodel perform well and clearly outperform the IRM and AGM in missing-link prediction, as measured by both the area under the ROC curve (AUC-ROC) and the area under the precision-recall curve (AUC-PR).

Table: Comparison of four algorithms' abilities to predict missing edges of a synthetic assortative network, reporting AUC-ROC and AUC-PR (mean ± one standard deviation) for the IRM, Eigenmodel, AGM, and GP-EPM; the number of features for the Eigenmodel is set as K = 4.
18 Improve EPM to model dissortativity: synthetic dissortative network

Four communities with dense intra-community or inter-community connections.

Figure: Comparison of four algorithms' abilities to recover the ground-truth link probabilities using 80% of the pairs of nodes randomly selected from a synthetic relational network that exhibits clear dissortativity. Panels: (a) ground truth, (b) adjacency matrix, (c) IRM, (d) Eigenmodel, (e) AGM, (f) GP-EPM.
19 Improve EPM to model dissortativity: EPM for dissortative networks (Zhou, AISTATS 2015)

EPM that captures community-community interactions:
$$b_{ij} = \mathbf{1}(m_{ij} \ge 1), \quad m_{ij} = \sum_{k_1=1}^K \sum_{k_2=1}^K m_{i k_1 k_2 j}, \quad m_{i k_1 k_2 j} \sim \mathrm{Po}(\phi_{i k_1} \lambda_{k_1 k_2} \phi_{j k_2}).$$

A relational hierarchical gamma process is used to support $K \to \infty$. The latent feature matrix $\{\phi_k\}$ and the community-community interaction rate matrix $\{\lambda_{k_1 k_2}\}$ are inferred for the improved EPM model on the Protein230 network.
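The role of the interaction-rate matrix can be illustrated with a small sketch: an off-diagonal-heavy $\Lambda$ yields a dissortative link-probability structure under the same Bernoulli-Poisson link (all values below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)

N, K = 4, 2
phi = rng.gamma(1.0, 1.0, size=(N, K))   # node-community affiliations
# Symmetric community-community rates; mass off the diagonal => dissortative:
# nodes tend to link across communities rather than within them.
Lam = np.array([[0.05, 2.0],
                [2.0, 0.05]])

# lambda_ij = sum_{k1,k2} phi_ik1 * Lam_k1k2 * phi_jk2
lam = phi @ Lam @ phi.T
p_link = 1.0 - np.exp(-lam)              # Bernoulli-Poisson link
```

Setting `Lam` close to diagonal instead recovers the assortative GP-EPM structure, with `Lam`'s diagonal playing the role of the rates $r_k$.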
20 Improve EPM to model dissortativity: EPM for dissortative networks (Zhou, AISTATS 2015)

Figure: Comparison of three models on estimating the link probabilities for the Protein230 network using 80% of its node pairs. Panels: (a) Protein interaction network, (b) HGP-EPM, (c) GP-EPM, (d) IRM.
21 Conclusions

- The gamma process edge partition model (GP-EPM) provides an efficient and effective way to model assortative relational networks.
- The GP-EPM has limited ability to model dissortativity in relational networks.
- As in (Zhou, AISTATS 2015), one may modify the GP-EPM to capture community-community interactions in order to model dissortativity.
Paper: Mingyuan Zhou, "Infinite Edge Partition Models for Overlapping Community Detection and Link Prediction," McCombs School of Business, The University of Texas at Austin, Austin, TX 78712, USA (arXiv v2 [stat.ML], 30 Dec 2015).
More informationStat 542: Item Response Theory Modeling Using The Extended Rank Likelihood
Stat 542: Item Response Theory Modeling Using The Extended Rank Likelihood Jonathan Gruhl March 18, 2010 1 Introduction Researchers commonly apply item response theory (IRT) models to binary and ordinal
More informationRandom Effects Models for Network Data
Random Effects Models for Network Data Peter D. Hoff 1 Working Paper no. 28 Center for Statistics and the Social Sciences University of Washington Seattle, WA 98195-4320 January 14, 2003 1 Department of
More informationBayesian nonparametric models for bipartite graphs
Bayesian nonparametric models for bipartite graphs François Caron Department of Statistics, Oxford Statistics Colloquium, Harvard University November 11, 2013 F. Caron 1 / 27 Bipartite networks Readers/Customers
More informationScaling Neighbourhood Methods
Quick Recap Scaling Neighbourhood Methods Collaborative Filtering m = #items n = #users Complexity : m * m * n Comparative Scale of Signals ~50 M users ~25 M items Explicit Ratings ~ O(1M) (1 per billion)
More informationImage segmentation combining Markov Random Fields and Dirichlet Processes
Image segmentation combining Markov Random Fields and Dirichlet Processes Jessica SODJO IMS, Groupe Signal Image, Talence Encadrants : A. Giremus, J.-F. Giovannelli, F. Caron, N. Dobigeon Jessica SODJO
More informationLifted and Constrained Sampling of Attributed Graphs with Generative Network Models
Lifted and Constrained Sampling of Attributed Graphs with Generative Network Models Jennifer Neville Departments of Computer Science and Statistics Purdue University (joint work with Pablo Robles Granda,
More informationNetwork Event Data over Time: Prediction and Latent Variable Modeling
Network Event Data over Time: Prediction and Latent Variable Modeling Padhraic Smyth University of California, Irvine Machine Learning with Graphs Workshop, July 25 th 2010 Acknowledgements PhD students:
More informationBias-Variance Trade-Off in Hierarchical Probabilistic Models Using Higher-Order Feature Interactions
- Trade-Off in Hierarchical Probabilistic Models Using Higher-Order Feature Interactions Simon Luo The University of Sydney Data61, CSIRO simon.luo@data61.csiro.au Mahito Sugiyama National Institute of
More informationProbabilistic Graphical Models
Probabilistic Graphical Models Lecture 12 Dynamical Models CS/CNS/EE 155 Andreas Krause Homework 3 out tonight Start early!! Announcements Project milestones due today Please email to TAs 2 Parameter learning
More informationModeling heterogeneity in random graphs
Modeling heterogeneity in random graphs Catherine MATIAS CNRS, Laboratoire Statistique & Génome, Évry (Soon: Laboratoire de Probabilités et Modèles Aléatoires, Paris) http://stat.genopole.cnrs.fr/ cmatias
More informationSTA 4273H: Statistical Machine Learning
STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 3 Linear
More informationLearning Bayesian networks
1 Lecture topics: Learning Bayesian networks from data maximum likelihood, BIC Bayesian, marginal likelihood Learning Bayesian networks There are two problems we have to solve in order to estimate Bayesian
More information