On identification of multi-factor models with correlated residuals

Biometrika (2004), 91, 1. Biometrika Trust. Printed in Great Britain.

BY MICHEL GRZEBYK
Department of Pollutants Metrology, INRS, Avenue de Bourgogne, Vandœuvre-lès-Nancy Cedex, France

PASCAL WILD AND DOMINIQUE CHOUANIÈRE
Department of Epidemiology, INRS, Avenue de Bourgogne, Vandœuvre-lès-Nancy Cedex, France

SUMMARY

We specify some conditions for the identification of a multi-factor model with correlated residuals, uncorrelated factors and zero restrictions in the factor loadings. These conditions are derived from the results of Stanghellini (1997) and Vicard (2000), which deal with single-factor models with zero restrictions in the concentration matrix. Like these authors, we make use of the complementary graph of the residuals, and the conditions build on the role of odd cycles in this graph. In contrast to these authors, however, we consider the case where the conditional dependencies of the residuals are expressed in terms of a covariance matrix rather than its inverse, the concentration matrix. We first derive the corresponding condition for identification of single-factor models with structural zeros in the covariance matrix of the residuals. This is extended to the case where some factor loadings are constrained to be zero. We use these conditions to obtain a sufficient and a necessary condition for identification of multi-factor models.

Some key words: Complementary graph; Covariance graph; Odd cycle; Structural constraint.

1. INTRODUCTION

Our work is motivated by neurotoxicology. Several test variables coming from different neurobehavioural tests were referred to a limited number of unmeasured mental functions, or factors, each influencing some but not all of these variables. However, other factors of no scientific interest, such as the experimental conditions, may also influence some test variables.
This situation can be expressed as a multi-factor model with some factor loadings constrained to zero, expressing the conditional independence between some test variables and some mental functions represented as latent factors. These zero factor loadings are termed structural zero factor loadings. For example, if variables exploring long-term memory have been measured along with others, like the simple reaction time which explores nervous motor function, the reaction time performance may be considered independent of long-term memory, given the latent factor characterising nervous motor function. We suppose here that, if all factors are included, the test variables are independent conditionally on the factors.

However, we may want to marginalise over the factors with no scientific content, such as the level of mental concentration of the experimental subject on the neurobehavioural tests. This implies some marginal correlations among the test variables with a common factor over which we marginalised, conditional on the remaining factors.

The problem we consider in this paper is the problem of identification. We consider neither the problem of the existence of the model, in the words of Anderson & Rubin (1956), nor the problem of estimation. Anderson & Rubin (1956) gave some conditions for identification of multi-factor models with uncorrelated residuals, and Giudici & Stanghellini (2001) gave a sufficient condition for identification of a subclass of such multi-factor models in which each observed variable is a response to one factor only and the dependencies between the residuals are expressed in terms of zero coefficients in the concentration matrix, i.e. the inverse of the covariance matrix. In this paper we build mostly on the work by Vicard (2000), who gave a necessary and sufficient condition for identification of such single-factor models. The main difference is that the independence structure in the residuals we shall consider is expressed in terms of zero restrictions on the correlation coefficients. We further restrict ourselves to the case of uncorrelated factors.

After we introduce notation in §2, §3 gives a necessary and sufficient condition for identification of single-factor models with correlated residuals when the dependencies between the residuals are expressed in terms of zero covariance coefficients. Section 4 gives both a necessary and a sufficient condition for the identification of the considered multi-factor models with uncorrelated factors and correlated errors. These results are illustrated by a series of examples showing when our results can or cannot be applied.

2. NOTATION AND DEFINITIONS
2.1. The problem

We consider the multi-factor model

    X = λξ + δ,    (1)

where X is a vector of q observed variables X_u (u = 1, ..., q) with mean 0 and covariance matrix Σ, ξ is a vector of p independent normally distributed factors ξ_i (i = 1, ..., p), each with mean 0 and variance 1, λ is a q × p matrix of factor loadings and δ is a vector of q normally distributed residuals with mean 0. The covariance matrix of δ, denoted by H, is the covariance matrix of X conditional on ξ. It is assumed that cov(ξ, ξ) = I_p and cov(ξ, δ) = 0. Furthermore, the factor loadings λ contain structural zeros, λ_{ui} = 0 for some pairs (X_u, ξ_i). These structural constraints are the expression of the substantive knowledge about the independence of some X_u's from some factors, conditional on the other factors. We further assume that H comprises structural zeros, cov(X_u, X_v | ξ) = 0, according to substantive knowledge. All the other parameters are assumed nonzero.

According to (1), the implied covariance model of the observed variables, expressed as a function of the parameters λ and H, is

    Σ(λ, H) = λλᵀ + H.    (2)

The model is said to be identified if, for two sets of parameters (λ, H) and (λ', H') satisfying the structural constraints, Σ(λ, H) = Σ(λ', H') implies that λ = λ' and H = H'. A well-known first necessary condition for identification of such a model is that the number of parameters be less than or equal to the number of nonnull relationships in equation (2).
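As a concrete illustration of the implied covariance (2) and the parameter-counting condition, the following sketch builds Σ(λ, H) = λλᵀ + H for a small hypothetical single-factor model. All numerical values and the particular zero pattern are illustrative assumptions, not values taken from the paper.

```python
# Minimal numeric sketch of Sigma(lambda, H) = lambda lambda^T + H,
# equation (2), for a hypothetical model with q = 4 observed variables
# and p = 1 factor.  Values and zero pattern are assumptions.

q = 4
lam = [[1.0], [0.8], [0.5], [1.2]]          # q x p loading matrix
H = [[1.0, 0.0, 0.3, 0.0],                  # residual covariance matrix;
     [0.0, 1.0, 0.0, 0.0],                  # the zeros are structural
     [0.3, 0.0, 1.0, 0.2],
     [0.0, 0.0, 0.2, 1.0]]

def implied_cov(lam, H):
    """Return Sigma = lam * lam^T + H as nested lists."""
    q, p = len(lam), len(lam[0])
    return [[sum(lam[u][i] * lam[v][i] for i in range(p)) + H[u][v]
             for v in range(q)] for u in range(q)]

S = implied_cov(lam, H)

# Counting condition: the free parameters (nonzero loadings plus the
# distinct nonzero entries of H) must not exceed the q(q+1)/2 distinct
# entries of the symmetric matrix Sigma.
n_loadings = sum(1 for row in lam for x in row if x != 0)
n_resid = sum(1 for u in range(q) for v in range(u, q) if H[u][v] != 0)
n_equations = q * (q + 1) // 2
assert n_loadings + n_resid <= n_equations
```

In this toy example the count is exactly tight (10 parameters against 10 equations), which shows why counting alone cannot settle identification: the graphical conditions of §3 are still needed.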

As the identification issues are based solely on the moments, normality assumptions are not central to our considerations.

2.2. Graphical representation

Such factorial models can be considered as chain graphs, denoted by G = (K, E), organised in two boxes. The set of vertices, K, consists of the factors ξ in the first box and the observed variables X in the second box. The set of edges, E, consists of the subset of directed edges E_{Xξ} = {(u, i) : λ_{ui} ≠ 0} from the box of factors to the box of variables and the subset of undirected edges E_X = {(u, v) : h_{uv} ≠ 0 and u ≠ v} within the box of variables. The subgraph G_X = (X, E_X) is the covariance graph (Cox & Wermuth, 1996, p. 30) of the observed variables conditional on the factors, that is the covariance graph of the errors δ, which from now on we call the conditional covariance graph. We define its complementary graph as Ḡ_X = (X, Ē_X) with Ē_X = {(u, v) : h_{uv} = 0}.

Following the convention defined by Cox & Wermuth (1996), the covariance graph G_X is visualised with undirected dashed lines within the box of the observed variables, since an absence of an edge means a zero covariance between observed variables, conditional on the factors ξ. Likewise, chain graphs are read from right to left; directed edges are visualised as full arrows pointing from the factors on the right to the observed variables on the left, indicating a dependency of a variable on a factor conditional on all other factors but not on the other variables. Examples of chain graphs and their graphical representation are given throughout the text.

3. SINGLE-FACTOR MODELS

3.1. Preamble

Vicard (2000) gave a necessary and sufficient condition for identification of single-factor models with correlated errors. This condition is based on the complementary graph of the conditional concentration graph.
The result also holds when the dependencies are expressed in terms of a covariance graph, and if structural zero constraints on the factor loadings are allowed.

3.2. Structural zeros in the covariance matrix

In this subsection, we suppose that all the factor loadings are nonzero.

THEOREM 1. A necessary and sufficient condition for a single-factor model to be identified is that the complementary graph of the conditional covariance graph, Ḡ_X, satisfies the following two conditions: (i) each connectivity component contains at least one odd cycle; (ii) the sign of one factor loading per connectivity component is given.

Proof. Let (Λ, H) be the set of parameters representing the factor loadings and the conditional covariance matrix. The parameters Λ_u are nonzero, whereas H_{uv} is zero if and only if (u, v) ∈ Ē_X. According to (2), to establish the identification of the model, we have to check for the uniqueness of the solution to the system

    Λ_u Λ_v = λ_u λ_v    ((u, v) ∈ Ē_X),    (3)
    H_{uv} + Λ_u Λ_v = h_{uv} + λ_u λ_v    ((u, v) ∈ E_X or u = v),    (4)

in which (Λ, H) are constants and (λ, h) are the unknowns.

According to the structure of the system, each solution for λ leads to a unique solution for h. If Ḡ_X contains more than one connectivity component, the system (3) can be split into disjoint subsystems, each one corresponding to a single connectivity component. The identification of λ can then be established independently in each connectivity component. Subsequently, we suppose that Ḡ_X is connected; the proof applies to each connectivity component.

Let X_1 and X_2 be two distinct but arbitrary variables. Then there exists at least one path connecting the corresponding nodes in Ḡ_X, and (3) leads to

    λ_2 = Λ_1 Λ_2 / λ_1,    if the number of edges in the path is odd,    (5)
    λ_2 = λ_1 Λ_2 / Λ_1,    if the number of edges in the path is even.    (6)

We first prove sufficiency. Suppose there exists an odd cycle in Ḡ_X, and suppose that X_1 and X_2 belong to this cycle. Then there exist two distinct paths from X_1 to X_2 through this cycle. As the cycle is odd, one of the paths is odd and the other is even, so that both (5) and (6) apply and give

    λ_1² = Λ_1².    (7)

Thus λ_u = aΛ_u, with a = ±1, for all nodes in the odd cycle. It follows that, if X_1 is in the odd cycle and X_2 is not, then both equations (5) and (6) reduce to λ_2 = aΛ_2, whatever the length of the path. If the sign of one factor loading is fixed a priori, then a = 1 and λ_u = Λ_u for all vertices. Substituting λ_u by Λ_u in (4) gives h_{uv} = H_{uv} for all (u, v) ∈ E_X or u = v. Hence the model is identified.

We next prove necessity. Suppose that there is no odd cycle in Ḡ_X. Two cases are possible. First, if there is no cycle, there exists only a single path between two vertices. Therefore, for any choice of λ_1, either (5) or (6) provides a solution for any λ_2, which means that the model is not identified. Secondly, suppose that Ḡ_X contains even cycles, and assign an arbitrary value to λ_1.
Given any other distinct node X_2, all the paths connecting X_1 and X_2 have the same parity, because Ḡ_X does not contain an odd cycle. Then all the paths connecting X_1 and X_2 lead to the same equation, which expresses λ_2 as a function of λ_1, either through (5) or through (6). Thus the model is not identified. ∎

Note that the necessary condition was derived before by Marchetti & Stanghellini (1996).

3.3. Structural zeros in the factor loadings of a single-factor model

We suppose that the first q* factor loadings are nonzero and that the last q − q* loadings are zero. This situation is obtained after marginalising over some factors of no substantive interest. Furthermore, the results presented in this section will be used for the identification of multi-factor models. Let X(ξ) = {X_1, ..., X_{q*}} be the subset of children of ξ; this is the subset of observed variables on which the latent factor loads. Let G*_{X(ξ)·ξ} denote the subgraph of G_X induced by X(ξ), and let Ḡ*_{X(ξ)·ξ} denote its complement. The following corollary results from Theorem 1.

COROLLARY 1. A necessary and sufficient condition for a single-factor model with zero factor loadings to be identified is that the graph Ḡ*_{X(ξ)·ξ} satisfies the following conditions: (i) each connectivity component contains at least one odd cycle; (ii) the sign of one factor loading per connectivity component is given.

Proof. The vector of factor loadings λ can be written λᵀ = (λ*ᵀ, 0, ..., 0), where λ* is the q*-vector of nonzero factor loadings. Then λλᵀ is a block-diagonal matrix,

    λλᵀ = ( λ*λ*ᵀ  0 )
          (   0    0 ).

The expression of Σ(λ, H) in (2) leads to the system of equations

    σ_{uv} = h_{uv},    for u > q* or v > q*,    (8)
    σ_{uv} = λ_u λ_v,    for (u, v) ∈ Ē*_{X(ξ)·ξ},    (9)
    σ_{uv} = h_{uv} + λ_u λ_v,    for (u, v) ∈ E*_{X(ξ)·ξ} or u = v,    (10)

where σ_{uv} is the (u, v) entry in Σ(λ, H). Equation (8) identifies h_{uv} for u > q* or v > q*. Equations (9) and (10) correspond to the equations of identification of the single-factor model, without zero constraints on the factor loadings, induced by the children of ξ. The complementary graph of the conditional covariance graph of this single-factor model is Ḡ*_{X(ξ)·ξ}, to which Theorem 1 applies. ∎

A similar result was proved by Anderson & Rubin (1956, Theorem 5.4) for uncorrelated residuals.

4. MULTI-FACTOR MODELS

4.1. Introduction

In this section, we give conditions for identification of multi-factor models using the conditions for identification of single-factor models. Submodels with fewer factors can be derived from a multi-factor model by marginalising over and conditioning on subsets of factors. The associated graphs of these submodels are called summary graphs (Cox & Wermuth, 1996, p. 196). Let ξ_s, ξ_m and ξ_c be a partition of the set of factors ξ. We consider the submodel derived from (1) by conditioning on ξ_c and marginalising over ξ_m. Thus, only the observed variables X and the subset of factors ξ_s remain.
The edges of the graph of this submodel can easily be deduced from those of the complete graph: the set of directed edges E_{Xξ_s} is obtained by removing from E_{Xξ} the edges coming from a factor either over which we marginalise or on which we condition; the set of undirected edges E_{X_s} is obtained by adding to E_X the edges connecting the pairs of nodes with a common factor in ξ_m.

In particular, we focus on submodels where ξ_s is a single factor in the sequence. As indicated in §3.3, the conditional covariance graphs induced by the children of the single factor are of particular interest; they are termed induced conditional covariance graphs of X(ξ_s), conditional on ξ_c, marginalised over ξ_m.
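The odd-cycle requirement that these induced graphs must satisfy (condition (i) of Theorem 1) can be checked mechanically: a component of a graph contains an odd cycle exactly when it is not bipartite, i.e. not 2-colourable. The sketch below checks this by breadth-first 2-colouring, assuming the complementary graph is given as an adjacency dictionary; the two example graphs are illustrative, not graphs from the paper.

```python
# Check condition (i) of Theorem 1: every connectivity component of the
# complementary graph must contain an odd cycle.  A component has an odd
# cycle iff it is not bipartite, which BFS 2-colouring detects.
from collections import deque

def every_component_has_odd_cycle(adj):
    """adj: dict mapping each vertex to the set of its neighbours."""
    colour = {}
    for start in adj:
        if start in colour:
            continue                    # vertex already visited
        colour[start] = 0
        queue = deque([start])
        bipartite = True
        while queue:                    # BFS over this component
            u = queue.popleft()
            for v in adj[u]:
                if v not in colour:
                    colour[v] = 1 - colour[u]
                    queue.append(v)
                elif colour[v] == colour[u]:
                    bipartite = False   # same colour twice: odd cycle
        if bipartite:                   # 2-colourable: no odd cycle here
            return False
    return True

# A triangle (odd cycle): condition (i) holds.
triangle = {1: {2, 3}, 2: {1, 3}, 3: {1, 2}}
# A 4-cycle (even cycle only): condition (i) fails.
square = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
print(every_component_has_odd_cycle(triangle))  # True
print(every_component_has_odd_cycle(square))    # False
```

Because BFS visits each edge a constant number of times, this check is linear in the number of edges, consistent with the complexity remarks in §5.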

4.2. A necessary condition for identification of a multi-factor model

THEOREM 2. A necessary condition for a multi-factor model with uncorrelated factors to be identified is that all the single-factor models obtained by conditioning on all factors but one be identified. Equivalently, if at least one single-factor model obtained by conditioning on all factors but one is not identified, then the multi-factor model is not identified.

Proof. As these two conditions are equivalent, we only prove the second. Let G be a multi-factor model with uncorrelated factors and parameters {Λ, H}. Suppose that the single-factor model obtained by conditioning on all but factor ξ_i is not identified. Let Λ_i be the vector of factor loadings of ξ_i, and let Λ_{−i} be the q × (p − 1) submatrix of the factor loadings of the remaining factors. Then {Λ_i, H} is a set of parameters of a single-factor model. As it is not identified, there exists another set of parameters {λ_i, h} such that Λ_i Λ_iᵀ + H = λ_i λ_iᵀ + h. Then

    Λ_{−i} Λ_{−i}ᵀ + Λ_i Λ_iᵀ + H = Λ_{−i} Λ_{−i}ᵀ + λ_i λ_iᵀ + h,

which means that {λ_i, Λ_{−i}, h} is another solution for the set of parameters of the initial multi-factor model. ∎

This condition is illustrated in the following example. Figure 1(a) represents the chain graph of a multi-factor model with seven observed variables and two factors. The induced conditional covariance graphs of the two single-factor models are represented in Figs 1(b), (c). As the properties of the complementary graphs are the most important, we draw the edges of the complementary graph with full lines together with the dashed lines of the covariance graph. With this convention, we can easily check the existence of odd cycles in the full-lines graph.

Fig. 1. A multi-factor model with 7 observed variables and 2 factors. The chain graph is shown in (a).
(b) and (c) show the induced conditional covariance graphs (dashed lines) and their complementary graphs (full lines) of the two single-factor models obtained by conditioning on ξ_1 and ξ_2 respectively. The single-factor model induced by conditioning on ξ_1 is identified, but the single-factor model induced by conditioning on ξ_2 is not identified. Thus this model is not identified, according to Theorem 2.

For example, in Fig. 1(c), the complementary graph of the induced conditional covariance graph of X(ξ_1) does not contain an odd cycle, so that the multi-factor model is not identified.

This condition is not sufficient, as the following example shows. The chain graph of a multi-factor model with six observed variables and two factors is presented in Fig. 2(a). Both complementary graphs of the induced conditional covariance graphs, of X(ξ_1) conditional on ξ_2 and of X(ξ_2) conditional on ξ_1, contain odd cycles, see Figs 2(b), (c), but the multi-factor model is not identified. To show this, we compare the number of unknowns and the number of equations. The number of nonzero covariances in Σ(λ, H) is called the apparent number of equations. This is not always the actual number of independent equations, as some equations may be redundant.

Fig. 2. A multi-factor model with 6 observed variables and 2 factors. The chain graph is shown in (a). (b) and (c) show the induced conditional covariance graphs (dashed lines) and their complementary graphs (full lines) of the two single-factor models obtained by conditioning on ξ_1 and ξ_2 respectively. Both single-factor models obtained by conditioning on one factor are identified, but the multi-factor model is not identified.

The model in Fig. 2 has 17 parameters, and the apparent number of equations is 18. However, two equations are redundant, because the model imposes two tetrad conditions of the form

    cov(X_u, X_v) cov(X_w, X_z) / {cov(X_u, X_z) cov(X_w, X_v)} = 1

(the indices of the variables involved are determined by the structure in Fig. 2). Thus, the actual number of equations is 16, which is lower than the number of unknowns.

COROLLARY 2. A necessary condition for identification of a multi-factor model with uncorrelated factors is that each factor have at least three children. Furthermore, if there exists at least one residual correlation within the set of the children of any given factor, the number of children must be at least four.

Proof. Suppose that a factor has fewer than three children. Then the single-factor model obtained by conditioning on all but this factor is not identified, since the complementary graph of its conditional covariance graph has only one or two nodes and thus cannot contain any odd cycle. If there exists a correlation within the children of a factor, the complementary graph of the induced conditional covariance graph of the single-factor model obtained by conditioning on all but this factor cannot have any odd cycle if the number of children is 1, 2 or 3. ∎

This result was proved by Anderson & Rubin (1956) in the case of uncorrelated errors.

4.3. A sufficient condition for identification of a multi-factor model

THEOREM 3. A sufficient condition for identification of a multi-factor model with uncorrelated factors is that there exists at least one sequence of factors, ξ_{π(i)}, such that each single-factor model obtained by conditioning on {ξ_{π(j)}, j < i} and marginalising over {ξ_{π(j)}, j > i} is identified.

Proof. We show that, if such a permutation exists, identification is reached successively for each vector of factor loadings λ_{π(i)}. In order to simplify notation, the factors are reordered such that π(i) = i. Let {Λ, H} be a set of parameters of a multi-factor model, and let {λ, h} be a second set of parameters of the same model. By (2), the identification equation of the model is

    ΛΛᵀ + H = λλᵀ + h.    (11)

In the proof, we denote by Λ_i the ith column of the matrix Λ and make use of the general property that ΛΛᵀ = Σ_{i=1}^p Λ_i Λ_iᵀ.

First we consider identification of λ_1. Let H_1 = Σ_{i=2}^p Λ_i Λ_iᵀ + H and h_1 = Σ_{i=2}^p λ_i λ_iᵀ + h. Equation (11) can be rewritten as

    Λ_1 Λ_1ᵀ + H_1 = λ_1 λ_1ᵀ + h_1.    (12)

This equation corresponds to the identification equation of the single-factor model obtained by marginalising over all but factor ξ_1. As it is supposed to be identified, λ_1 = Λ_1.

Next we consider identification of λ_j. We suppose that λ_i = Λ_i for all i < j.
Then (11) reduces to

    Σ_{i=j}^p Λ_i Λ_iᵀ + H = Σ_{i=j}^p λ_i λ_iᵀ + h.    (13)

This equation is the identification equation of the factor model with p − j + 1 factors obtained by conditioning on the first j − 1 factors. As for the identification of λ_1, we let H_j = Σ_{i=j+1}^p Λ_i Λ_iᵀ + H and h_j = Σ_{i=j+1}^p λ_i λ_iᵀ + h. Then (13) can be rewritten as

    Λ_j Λ_jᵀ + H_j = λ_j λ_jᵀ + h_j.    (14)

This equation corresponds to the identification equation of the single-factor model obtained by conditioning on {ξ_{π(j)}, j < i} and marginalising over {ξ_{π(j)}, j > i}. Again, as this single-factor model is supposed to be identified, λ_j = Λ_j.

The identification process continues until j = p − 1. Then λ_i = Λ_i for i = 1, ..., p − 1, and (11) reduces to

    Λ_p Λ_pᵀ + H = λ_p λ_pᵀ + h.    (15)

This is the identification equation of the single-factor model obtained by conditioning on all but factor ξ_p, which is supposed to be identified. Thus Λ_p = λ_p and H = h. ∎
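The single-factor step at the heart of this sequential argument can be made concrete numerically. On an odd cycle of length 3 in the complementary graph (three variables whose residuals are pairwise uncorrelated), equation (3) gives σ_{uv} = λ_u λ_v, so the loadings follow by elimination, and subtracting λλᵀ peels the factor off, as in the step from (13) to (14). The sketch below does this for a hypothetical single-factor model with diagonal residual covariance; all numbers are illustrative assumptions, not values from the paper.

```python
# Numeric sketch of one step of the sequential identification in
# Theorem 3: recover a factor's loadings from the implied covariance
# using a triangle (odd 3-cycle) in the complementary graph, then
# subtract the factor's contribution.  Values are assumptions.
import math

l_true = [1.0, 0.8, 0.5, 1.2]             # hypothetical loadings
h_diag = [0.7, 0.9, 1.1, 0.6]             # diagonal residual variances
q = len(l_true)
S = [[l_true[u] * l_true[v] + (h_diag[u] if u == v else 0.0)
      for v in range(q)] for u in range(q)]

# Variables 0, 1, 2 form a triangle in the complementary graph, so
#   sigma_01 = l_0 l_1,  sigma_02 = l_0 l_2,  sigma_12 = l_1 l_2,
# hence l_0**2 = sigma_01 * sigma_02 / sigma_12.  As in (7), this fixes
# l_0 only up to sign; the sign constraint of Theorem 1(ii) picks +.
l0 = math.sqrt(S[0][1] * S[0][2] / S[1][2])
l_rec = [l0] + [S[0][v] / l0 for v in range(1, q)]

# Peeling the factor off leaves the residual covariance matrix.
H_rec = [[S[u][v] - l_rec[u] * l_rec[v] for v in range(q)]
         for u in range(q)]
print([round(x, 6) for x in l_rec])       # recovers l_true
```

In the multi-factor setting the same elimination is applied factor by factor along the permutation of Theorem 3, with H_rec playing the role of H_j in (14).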

This condition is illustrated by the following example. We consider a multi-factor model with six observed variables and three factors, whose chain graph is given in Fig. 3(a). This multi-factor model is identified by Theorem 3; the sequence (ξ_2, ξ_1, ξ_3) identifies the parameters. The induced conditional covariance graphs of the successive single-factor models resulting from the sequence are given in Fig. 3(b).

Fig. 3. A multi-factor model with 6 observed variables and 3 factors. The chain graph is shown in (a). (b) shows the induced conditional covariance graphs (dashed lines) and their complementary graphs (full lines) of the sequence of single-factor models, illustrating the operations that prove the identification of the model using the sequence (ξ_2, ξ_1, ξ_3).

Fig. 4. A multi-factor model with 7 observed variables and 2 factors. The chain graph is shown in (a). (b) shows the induced conditional covariance graph (dashed lines) and its complementary graph (full lines) of the single-factor model obtained by marginalising over ξ_1 for the tested sequence (ξ_2, ξ_1). (c) shows the corresponding graph of the single-factor model obtained by marginalising over ξ_2 for the tested sequence (ξ_1, ξ_2). Neither of these single-factor models is identified. However, this multi-factor model is identified.

The following example proves that this condition is not necessary. We consider a multi-factor model with seven observed variables and two factors. The structural constraints are presented in the chain graph in Fig. 4(a). Neither the sequence (ξ_1, ξ_2) nor (ξ_2, ξ_1) satisfies the condition of Theorem 3, since neither of the single-factor models obtained by marginalising over ξ_2 or ξ_1 is identified. However, the identification has been checked algebraically. Interested readers can contact M. Grzebyk for the full proof.

5. DISCUSSION

An issue with these conditions is their computational complexity. We note first that Vicard's (2000) algorithm is linear in |E*|, the number of edges in G*. For the necessary condition of identification of a multi-factor model, the algorithm builds the induced conditional covariance graphs of all the single-factor submodels obtained by conditioning on all but one factor, and checks that they satisfy the conditions of identification of Theorem 1, using the core algorithm. Thus the computational complexity of the algorithm is linear in p(|E| + K), where K is the complexity of the algorithm that derives the induced conditional covariance graph of a single-factor submodel obtained by conditioning on all but one factor. In the case of the sufficient condition for identification of a multi-factor model, the algorithm explores the set of permutations to look for a path among the latent variables which identifies the model. Thus, if the sufficient condition is not fulfilled, the parsing of all paths involves p! individual searches, each of which consists of at most p generations of induced conditional covariance graphs of a single-factor model, followed by checks for odd cycles.

If a particular graph is to be checked for identifiability on the basis of the present results, the necessary condition should be checked first, as it is simpler and quicker to verify.
If all these single-factor submodels are identified and no permutation of the factors is found by which identification can be shown through Theorem 3, the identification status of the model is still pending, since neither of these conditions is both necessary and sufficient. Note that, in the example presented in Fig. 1, the apparent number of equations was sufficient for identification but the model was shown not to be identified through Theorem 2, which, at least in this case, is a stronger criterion than equation counting.

The second part of the condition in Theorem 1 indicates that fixing one sign constraint for each factor is not sufficient for solving the invariance problem when there is more than one connectivity component in the graph Ḡ_X of a single-factor model. In this case, the sign of the factor loadings can be fixed independently in each connectivity component; these alternative choices lead to different solutions for the covariance matrix H, more precisely for the residual covariances between each pair of elements of X belonging to different connectivity components. Note that, in contrast to the factor loadings, these different solutions for H are not equal up to a sign change: they may correspond to truly different values. Thus the sign constraints have to be chosen carefully. This phenomenon is crucial too in the case of a multi-factor model. Suppose there exists a sequence such that Theorem 3 applies. If the complementary graph of the induced conditional covariance graph of some single-factor model obtained by conditioning on {ξ_{π(j)}, j < i} and marginalising over {ξ_{π(j)}, j > i} contains more than one connectivity component, choosing different sign constraints leads to different solutions for the factor loadings of the subsequent factors, as well as different solutions for the residual covariances. Note that this problem does not arise in factor models with uncorrelated residuals, since then the complementary graph of the conditional covariance graph is complete and hence connected.

Beyond the diagnosis of the identification of a given model, we saw in Corollary 2 that a first necessary condition for identification of a latent variable in particular studies is that each latent variable be characterised by at least three observed variables; that is, if one wants to explore a latent trait in a real-world study, for instance in neurotoxicology, it must be explored through at least three observed variables. Furthermore, if these observed variables are still correlated, conditionally on the latent trait, more than three observed variables are needed.

ACKNOWLEDGEMENT

We acknowledge the help of the editor and of the two anonymous referees, whose advice improved our paper significantly.

REFERENCES

ANDERSON, T. W. & RUBIN, H. (1956). Statistical inference in factor analysis. In Proc. 3rd Berkeley Symp. Math. Statist. Prob. 5, Ed. J. Neyman. Berkeley, CA: University of California Press.
COX, D. & WERMUTH, N. (1996). Multivariate Dependencies: Models, Analysis and Interpretation. London: Chapman and Hall.
GIUDICI, P. & STANGHELLINI, E. (2001). Bayesian inference for graphical factor analysis models. Psychometrika 66.
MARCHETTI, G. M. & STANGHELLINI, E. (1996). Alcune osservazioni sui modelli grafici in presenza di variabili latenti [Some remarks on graphical models in the presence of latent variables]. In Atti della XXXVIII Riunione Scientifica 2, Ed. Società Italiana di Statistica. Rome: Maggioli Editore.
STANGHELLINI, E. (1997). Identification of a single-factor model using graphical Gaussian rules. Biometrika 84.
VICARD, P. (2000). On identification of a single-factor model with correlated residuals. Biometrika 87.

[Received April. Revised September 2003]


More information

Total positivity in Markov structures

Total positivity in Markov structures 1 based on joint work with Shaun Fallat, Kayvan Sadeghi, Caroline Uhler, Nanny Wermuth, and Piotr Zwiernik (arxiv:1510.01290) Faculty of Science Total positivity in Markov structures Steffen Lauritzen

More information

A MCMC Approach for Learning the Structure of Gaussian Acyclic Directed Mixed Graphs

A MCMC Approach for Learning the Structure of Gaussian Acyclic Directed Mixed Graphs A MCMC Approach for Learning the Structure of Gaussian Acyclic Directed Mixed Graphs Ricardo Silva Abstract Graphical models are widely used to encode conditional independence constraints and causal assumptions,

More information

STA 4273H: Statistical Machine Learning

STA 4273H: Statistical Machine Learning STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Statistics! rsalakhu@utstat.toronto.edu! http://www.utstat.utoronto.ca/~rsalakhu/ Sidney Smith Hall, Room 6002 Lecture 3 Linear

More information

Graphical models and causality: Directed acyclic graphs (DAGs) and conditional (in)dependence

Graphical models and causality: Directed acyclic graphs (DAGs) and conditional (in)dependence Graphical models and causality: Directed acyclic graphs (DAGs) and conditional (in)dependence General overview Introduction Directed acyclic graphs (DAGs) and conditional independence DAGs and causal effects

More information

Definition: A binary relation R from a set A to a set B is a subset R A B. Example:

Definition: A binary relation R from a set A to a set B is a subset R A B. Example: Chapter 9 1 Binary Relations Definition: A binary relation R from a set A to a set B is a subset R A B. Example: Let A = {0,1,2} and B = {a,b} {(0, a), (0, b), (1,a), (2, b)} is a relation from A to B.

More information

Learning Multivariate Regression Chain Graphs under Faithfulness

Learning Multivariate Regression Chain Graphs under Faithfulness Sixth European Workshop on Probabilistic Graphical Models, Granada, Spain, 2012 Learning Multivariate Regression Chain Graphs under Faithfulness Dag Sonntag ADIT, IDA, Linköping University, Sweden dag.sonntag@liu.se

More information

ON THE CORE OF A GRAPHf

ON THE CORE OF A GRAPHf ON THE CORE OF A GRAPHf By FRANK HARARY and MICHAEL D. PLUMMER [Received 8 October 1965] 1. Introduction Let G be a graph. A set of points M is said to cover all the lines of G if every line of G has at

More information

Introduction to Probabilistic Graphical Models

Introduction to Probabilistic Graphical Models Introduction to Probabilistic Graphical Models Sargur Srihari srihari@cedar.buffalo.edu 1 Topics 1. What are probabilistic graphical models (PGMs) 2. Use of PGMs Engineering and AI 3. Directionality in

More information

Algebraic Representations of Gaussian Markov Combinations

Algebraic Representations of Gaussian Markov Combinations Submitted to the Bernoulli Algebraic Representations of Gaussian Markov Combinations M. SOFIA MASSA 1 and EVA RICCOMAGNO 2 1 Department of Statistics, University of Oxford, 1 South Parks Road, Oxford,

More information

Review: Directed Models (Bayes Nets)

Review: Directed Models (Bayes Nets) X Review: Directed Models (Bayes Nets) Lecture 3: Undirected Graphical Models Sam Roweis January 2, 24 Semantics: x y z if z d-separates x and y d-separation: z d-separates x from y if along every undirected

More information

Independencies. Undirected Graphical Models 2: Independencies. Independencies (Markov networks) Independencies (Bayesian Networks)

Independencies. Undirected Graphical Models 2: Independencies. Independencies (Markov networks) Independencies (Bayesian Networks) (Bayesian Networks) Undirected Graphical Models 2: Use d-separation to read off independencies in a Bayesian network Takes a bit of effort! 1 2 (Markov networks) Use separation to determine independencies

More information

Bayesian Estimation of Prediction Error and Variable Selection in Linear Regression

Bayesian Estimation of Prediction Error and Variable Selection in Linear Regression Bayesian Estimation of Prediction Error and Variable Selection in Linear Regression Andrew A. Neath Department of Mathematics and Statistics; Southern Illinois University Edwardsville; Edwardsville, IL,

More information

On the adjacency matrix of a block graph

On the adjacency matrix of a block graph On the adjacency matrix of a block graph R. B. Bapat Stat-Math Unit Indian Statistical Institute, Delhi 7-SJSS Marg, New Delhi 110 016, India. email: rbb@isid.ac.in Souvik Roy Economics and Planning Unit

More information

THE NUMBER OF LOCALLY RESTRICTED DIRECTED GRAPHS1

THE NUMBER OF LOCALLY RESTRICTED DIRECTED GRAPHS1 THE NUMBER OF LOCALLY RESTRICTED DIRECTED GRAPHS1 LEO KATZ AND JAMES H. POWELL 1. Preliminaries. We shall be concerned with finite graphs of / directed lines on n points, or nodes. The lines are joins

More information

Graphical Models 359

Graphical Models 359 8 Graphical Models Probabilities play a central role in modern pattern recognition. We have seen in Chapter 1 that probability theory can be expressed in terms of two simple equations corresponding to

More information

The Equivalence of Ergodicity and Weak Mixing for Infinitely Divisible Processes1

The Equivalence of Ergodicity and Weak Mixing for Infinitely Divisible Processes1 Journal of Theoretical Probability. Vol. 10, No. 1, 1997 The Equivalence of Ergodicity and Weak Mixing for Infinitely Divisible Processes1 Jan Rosinski2 and Tomasz Zak Received June 20, 1995: revised September

More information

Based on slides by Richard Zemel

Based on slides by Richard Zemel CSC 412/2506 Winter 2018 Probabilistic Learning and Reasoning Lecture 3: Directed Graphical Models and Latent Variables Based on slides by Richard Zemel Learning outcomes What aspects of a model can we

More information

On improving matchings in trees, via bounded-length augmentations 1

On improving matchings in trees, via bounded-length augmentations 1 On improving matchings in trees, via bounded-length augmentations 1 Julien Bensmail a, Valentin Garnero a, Nicolas Nisse a a Université Côte d Azur, CNRS, Inria, I3S, France Abstract Due to a classical

More information

Factorization of integer-valued polynomials with square-free denominator

Factorization of integer-valued polynomials with square-free denominator accepted by Comm. Algebra (2013) Factorization of integer-valued polynomials with square-free denominator Giulio Peruginelli September 9, 2013 Dedicated to Marco Fontana on the occasion of his 65th birthday

More information

10708 Graphical Models: Homework 2

10708 Graphical Models: Homework 2 10708 Graphical Models: Homework 2 Due Monday, March 18, beginning of class Feburary 27, 2013 Instructions: There are five questions (one for extra credit) on this assignment. There is a problem involves

More information

Graphical Models and Independence Models

Graphical Models and Independence Models Graphical Models and Independence Models Yunshu Liu ASPITRG Research Group 2014-03-04 References: [1]. Steffen Lauritzen, Graphical Models, Oxford University Press, 1996 [2]. Christopher M. Bishop, Pattern

More information

Row and Column Distributions of Letter Matrices

Row and Column Distributions of Letter Matrices College of William and Mary W&M ScholarWorks Undergraduate Honors Theses Theses, Dissertations, & Master Projects 5-2016 Row and Column Distributions of Letter Matrices Xiaonan Hu College of William and

More information

A lower bound for the Laplacian eigenvalues of a graph proof of a conjecture by Guo

A lower bound for the Laplacian eigenvalues of a graph proof of a conjecture by Guo A lower bound for the Laplacian eigenvalues of a graph proof of a conjecture by Guo A. E. Brouwer & W. H. Haemers 2008-02-28 Abstract We show that if µ j is the j-th largest Laplacian eigenvalue, and d

More information

Dependence. MFM Practitioner Module: Risk & Asset Allocation. John Dodson. September 11, Dependence. John Dodson. Outline.

Dependence. MFM Practitioner Module: Risk & Asset Allocation. John Dodson. September 11, Dependence. John Dodson. Outline. MFM Practitioner Module: Risk & Asset Allocation September 11, 2013 Before we define dependence, it is useful to define Random variables X and Y are independent iff For all x, y. In particular, F (X,Y

More information

Probabilistic Graphical Models (I)

Probabilistic Graphical Models (I) Probabilistic Graphical Models (I) Hongxin Zhang zhx@cad.zju.edu.cn State Key Lab of CAD&CG, ZJU 2015-03-31 Probabilistic Graphical Models Modeling many real-world problems => a large number of random

More information

Central Groupoids, Central Digraphs, and Zero-One Matrices A Satisfying A 2 = J

Central Groupoids, Central Digraphs, and Zero-One Matrices A Satisfying A 2 = J Central Groupoids, Central Digraphs, and Zero-One Matrices A Satisfying A 2 = J Frank Curtis, John Drew, Chi-Kwong Li, and Daniel Pragel September 25, 2003 Abstract We study central groupoids, central

More information

Part I. C. M. Bishop PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 8: GRAPHICAL MODELS

Part I. C. M. Bishop PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 8: GRAPHICAL MODELS Part I C. M. Bishop PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 8: GRAPHICAL MODELS Probabilistic Graphical Models Graphical representation of a probabilistic model Each variable corresponds to a

More information

Section Summary. Relations and Functions Properties of Relations. Combining Relations

Section Summary. Relations and Functions Properties of Relations. Combining Relations Chapter 9 Chapter Summary Relations and Their Properties n-ary Relations and Their Applications (not currently included in overheads) Representing Relations Closures of Relations (not currently included

More information

Estimation of Unique Variances Using G-inverse Matrix in Factor Analysis

Estimation of Unique Variances Using G-inverse Matrix in Factor Analysis International Mathematical Forum, 3, 2008, no. 14, 671-676 Estimation of Unique Variances Using G-inverse Matrix in Factor Analysis Seval Süzülmüş Osmaniye Korkut Ata University Vocational High School

More information

Technische Universität Dresden Institute of Numerical Mathematics

Technische Universität Dresden Institute of Numerical Mathematics Technische Universität Dresden Institute of Numerical Mathematics An Improved Flow-based Formulation and Reduction Principles for the Minimum Connectivity Inference Problem Muhammad Abid Dar Andreas Fischer

More information

Near-domination in graphs

Near-domination in graphs Near-domination in graphs Bruce Reed Researcher, Projet COATI, INRIA and Laboratoire I3S, CNRS France, and Visiting Researcher, IMPA, Brazil Alex Scott Mathematical Institute, University of Oxford, Oxford

More information

STA 4273H: Statistical Machine Learning

STA 4273H: Statistical Machine Learning STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistical Sciences! rsalakhu@cs.toronto.edu! h0p://www.cs.utoronto.ca/~rsalakhu/ Lecture 7 Approximate

More information

Fractional and circular 1-defective colorings of outerplanar graphs

Fractional and circular 1-defective colorings of outerplanar graphs AUSTRALASIAN JOURNAL OF COMBINATORICS Volume 6() (05), Pages Fractional and circular -defective colorings of outerplanar graphs Zuzana Farkasová Roman Soták Institute of Mathematics Faculty of Science,

More information

ON COST MATRICES WITH TWO AND THREE DISTINCT VALUES OF HAMILTONIAN PATHS AND CYCLES

ON COST MATRICES WITH TWO AND THREE DISTINCT VALUES OF HAMILTONIAN PATHS AND CYCLES ON COST MATRICES WITH TWO AND THREE DISTINCT VALUES OF HAMILTONIAN PATHS AND CYCLES SANTOSH N. KABADI AND ABRAHAM P. PUNNEN Abstract. Polynomially testable characterization of cost matrices associated

More information

Improper Solutions in Exploratory Factor Analysis: Causes and Treatments

Improper Solutions in Exploratory Factor Analysis: Causes and Treatments Improper Solutions in Exploratory Factor Analysis: Causes and Treatments Yutaka Kano Faculty of Human Sciences, Osaka University Suita, Osaka 565, Japan. email: kano@hus.osaka-u.ac.jp Abstract: There are

More information

Linear estimation in models based on a graph

Linear estimation in models based on a graph Linear Algebra and its Applications 302±303 (1999) 223±230 www.elsevier.com/locate/laa Linear estimation in models based on a graph R.B. Bapat * Indian Statistical Institute, New Delhi 110 016, India Received

More information

Criteria for existence of semigroup homomorphisms and projective rank functions. George M. Bergman

Criteria for existence of semigroup homomorphisms and projective rank functions. George M. Bergman Criteria for existence of semigroup homomorphisms and projective rank functions George M. Bergman Suppose A, S, and T are semigroups, e: A S and f: A T semigroup homomorphisms, and X a generating set for

More information

Intrinsic products and factorizations of matrices

Intrinsic products and factorizations of matrices Available online at www.sciencedirect.com Linear Algebra and its Applications 428 (2008) 5 3 www.elsevier.com/locate/laa Intrinsic products and factorizations of matrices Miroslav Fiedler Academy of Sciences

More information

The 3 Indeterminacies of Common Factor Analysis

The 3 Indeterminacies of Common Factor Analysis The 3 Indeterminacies of Common Factor Analysis James H. Steiger Department of Psychology and Human Development Vanderbilt University James H. Steiger (Vanderbilt University) The 3 Indeterminacies of Common

More information

A new distribution on the simplex containing the Dirichlet family

A new distribution on the simplex containing the Dirichlet family A new distribution on the simplex containing the Dirichlet family A. Ongaro, S. Migliorati, and G.S. Monti Department of Statistics, University of Milano-Bicocca, Milano, Italy; E-mail for correspondence:

More information

Laplacian Integral Graphs with Maximum Degree 3

Laplacian Integral Graphs with Maximum Degree 3 Laplacian Integral Graphs with Maximum Degree Steve Kirkland Department of Mathematics and Statistics University of Regina Regina, Saskatchewan, Canada S4S 0A kirkland@math.uregina.ca Submitted: Nov 5,

More information

DISCRIMINANTS, SYMMETRIZED GRAPH MONOMIALS, AND SUMS OF SQUARES

DISCRIMINANTS, SYMMETRIZED GRAPH MONOMIALS, AND SUMS OF SQUARES DISCRIMINANTS, SYMMETRIZED GRAPH MONOMIALS, AND SUMS OF SQUARES PER ALEXANDERSSON AND BORIS SHAPIRO Abstract. Motivated by the necessities of the invariant theory of binary forms J. J. Sylvester constructed

More information

Graph coloring, perfect graphs

Graph coloring, perfect graphs Lecture 5 (05.04.2013) Graph coloring, perfect graphs Scribe: Tomasz Kociumaka Lecturer: Marcin Pilipczuk 1 Introduction to graph coloring Definition 1. Let G be a simple undirected graph and k a positive

More information

Indicator Functions and the Algebra of the Linear-Quadratic Parametrization

Indicator Functions and the Algebra of the Linear-Quadratic Parametrization Biometrika (2012), 99, 1, pp. 1 12 C 2012 Biometrika Trust Printed in Great Britain Advance Access publication on 31 July 2012 Indicator Functions and the Algebra of the Linear-Quadratic Parametrization

More information

Parameterized Domination in Circle Graphs

Parameterized Domination in Circle Graphs Parameterized Domination in Circle Graphs Nicolas Bousquet 1, Daniel Gonçalves 1, George B. Mertzios 2, Christophe Paul 1, Ignasi Sau 1, and Stéphan Thomassé 3 1 AlGCo project-team, CNRS, LIRMM, Montpellier,

More information

A bad example for the iterative rounding method for mincost k-connected spanning subgraphs

A bad example for the iterative rounding method for mincost k-connected spanning subgraphs A bad example for the iterative rounding method for mincost k-connected spanning subgraphs Ashkan Aazami a, Joseph Cheriyan b,, Bundit Laekhanukit c a Dept. of Comb. & Opt., U. Waterloo, Waterloo ON Canada

More information

6.3 How the Associational Criterion Fails

6.3 How the Associational Criterion Fails 6.3. HOW THE ASSOCIATIONAL CRITERION FAILS 271 is randomized. We recall that this probability can be calculated from a causal model M either directly, by simulating the intervention do( = x), or (if P

More information

CPSC 540: Machine Learning

CPSC 540: Machine Learning CPSC 540: Machine Learning Undirected Graphical Models Mark Schmidt University of British Columbia Winter 2016 Admin Assignment 3: 2 late days to hand it in today, Thursday is final day. Assignment 4:

More information

Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science Algorithms For Inference Fall 2014

Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science Algorithms For Inference Fall 2014 Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science 6.438 Algorithms For Inference Fall 2014 Recitation 3 1 Gaussian Graphical Models: Schur s Complement Consider

More information

Binary Convolutional Codes of High Rate Øyvind Ytrehus

Binary Convolutional Codes of High Rate Øyvind Ytrehus Binary Convolutional Codes of High Rate Øyvind Ytrehus Abstract The function N(r; ; d free ), defined as the maximum n such that there exists a binary convolutional code of block length n, dimension n

More information

Graphical Models with Symmetry

Graphical Models with Symmetry Wald Lecture, World Meeting on Probability and Statistics Istanbul 2012 Sparse graphical models with few parameters can describe complex phenomena. Introduce symmetry to obtain further parsimony so models

More information

Eigenvectors Via Graph Theory

Eigenvectors Via Graph Theory Eigenvectors Via Graph Theory Jennifer Harris Advisor: Dr. David Garth October 3, 2009 Introduction There is no problem in all mathematics that cannot be solved by direct counting. -Ernst Mach The goal

More information

LOG-MULTIPLICATIVE ASSOCIATION MODELS AS LATENT VARIABLE MODELS FOR NOMINAL AND0OR ORDINAL DATA. Carolyn J. Anderson* Jeroen K.

LOG-MULTIPLICATIVE ASSOCIATION MODELS AS LATENT VARIABLE MODELS FOR NOMINAL AND0OR ORDINAL DATA. Carolyn J. Anderson* Jeroen K. 3 LOG-MULTIPLICATIVE ASSOCIATION MODELS AS LATENT VARIABLE MODELS FOR NOMINAL AND0OR ORDINAL DATA Carolyn J. Anderson* Jeroen K. Vermunt Associations between multiple discrete measures are often due to

More information

Identifying Graph Automorphisms Using Determining Sets

Identifying Graph Automorphisms Using Determining Sets Identifying Graph Automorphisms Using Determining Sets Debra L. Boutin Department of Mathematics Hamilton College, Clinton, NY 13323 dboutin@hamilton.edu Submitted: May 31, 2006; Accepted: Aug 22, 2006;

More information

STAT 730 Chapter 9: Factor analysis

STAT 730 Chapter 9: Factor analysis STAT 730 Chapter 9: Factor analysis Timothy Hanson Department of Statistics, University of South Carolina Stat 730: Multivariate Data Analysis 1 / 15 Basic idea Factor analysis attempts to explain the

More information

This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore.

This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore. This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore. Title Multiple extensions of generalized hexagons related to the simple groups McL and Co3. Author(s) Cuypers,

More information

Learning Marginal AMP Chain Graphs under Faithfulness

Learning Marginal AMP Chain Graphs under Faithfulness Learning Marginal AMP Chain Graphs under Faithfulness Jose M. Peña ADIT, IDA, Linköping University, SE-58183 Linköping, Sweden jose.m.pena@liu.se Abstract. Marginal AMP chain graphs are a recently introduced

More information

MATRIX REPRESENTATIONS AND INDEPENDENCIES IN DIRECTED ACYCLIC GRAPHS

MATRIX REPRESENTATIONS AND INDEPENDENCIES IN DIRECTED ACYCLIC GRAPHS Submitted to the Annals of Statistics MATRIX REPRESENTATIONS AND INDEPENDENCIES IN DIRECTED ACYCLIC GRAPHS By Giovanni M. Marchetti and Nanny Wermuth University of Florence and Chalmers/Göteborgs Universitet

More information

Analysis of the AIC Statistic for Optimal Detection of Small Changes in Dynamic Systems

Analysis of the AIC Statistic for Optimal Detection of Small Changes in Dynamic Systems Analysis of the AIC Statistic for Optimal Detection of Small Changes in Dynamic Systems Jeremy S. Conner and Dale E. Seborg Department of Chemical Engineering University of California, Santa Barbara, CA

More information

DISJOINT PAIRS OF SETS AND INCIDENCE MATRICES

DISJOINT PAIRS OF SETS AND INCIDENCE MATRICES DISJOINT PAIRS OF SETS AND INCIDENCE MATRICES B MARVIN MARCUS AND HENRK MINC In a recent paper [1] investigating the repeated appearance of zeros in the powers of a mtrix the following purely combinatorial

More information

MATHEMATICAL ENGINEERING TECHNICAL REPORTS. Boundary cliques, clique trees and perfect sequences of maximal cliques of a chordal graph

MATHEMATICAL ENGINEERING TECHNICAL REPORTS. Boundary cliques, clique trees and perfect sequences of maximal cliques of a chordal graph MATHEMATICAL ENGINEERING TECHNICAL REPORTS Boundary cliques, clique trees and perfect sequences of maximal cliques of a chordal graph Hisayuki HARA and Akimichi TAKEMURA METR 2006 41 July 2006 DEPARTMENT

More information

Perfect matchings in highly cyclically connected regular graphs

Perfect matchings in highly cyclically connected regular graphs Perfect matchings in highly cyclically connected regular graphs arxiv:1709.08891v1 [math.co] 6 Sep 017 Robert Lukot ka Comenius University, Bratislava lukotka@dcs.fmph.uniba.sk Edita Rollová University

More information

A basic canonical form of discrete-time compartmental systems

A basic canonical form of discrete-time compartmental systems Journal s Title, Vol x, 200x, no xx, xxx - xxx A basic canonical form of discrete-time compartmental systems Rafael Bru, Rafael Cantó, Beatriz Ricarte Institut de Matemàtica Multidisciplinar Universitat

More information

Boolean Inner-Product Spaces and Boolean Matrices

Boolean Inner-Product Spaces and Boolean Matrices Boolean Inner-Product Spaces and Boolean Matrices Stan Gudder Department of Mathematics, University of Denver, Denver CO 80208 Frédéric Latrémolière Department of Mathematics, University of Denver, Denver

More information

RESEARCH ARTICLE. An extension of the polytope of doubly stochastic matrices

RESEARCH ARTICLE. An extension of the polytope of doubly stochastic matrices Linear and Multilinear Algebra Vol. 00, No. 00, Month 200x, 1 15 RESEARCH ARTICLE An extension of the polytope of doubly stochastic matrices Richard A. Brualdi a and Geir Dahl b a Department of Mathematics,

More information

ARTICLE IN PRESS. Journal of Multivariate Analysis ( ) Contents lists available at ScienceDirect. Journal of Multivariate Analysis

ARTICLE IN PRESS. Journal of Multivariate Analysis ( ) Contents lists available at ScienceDirect. Journal of Multivariate Analysis Journal of Multivariate Analysis ( ) Contents lists available at ScienceDirect Journal of Multivariate Analysis journal homepage: www.elsevier.com/locate/jmva Marginal parameterizations of discrete models

More information

Rao s degree sequence conjecture

Rao s degree sequence conjecture Rao s degree sequence conjecture Maria Chudnovsky 1 Columbia University, New York, NY 10027 Paul Seymour 2 Princeton University, Princeton, NJ 08544 July 31, 2009; revised December 10, 2013 1 Supported

More information

Lecture Notes 1 Basic Probability. Elements of Probability. Conditional probability. Sequential Calculation of Probability

Lecture Notes 1 Basic Probability. Elements of Probability. Conditional probability. Sequential Calculation of Probability Lecture Notes 1 Basic Probability Set Theory Elements of Probability Conditional probability Sequential Calculation of Probability Total Probability and Bayes Rule Independence Counting EE 178/278A: Basic

More information

Coloring Vertices and Edges of a Path by Nonempty Subsets of a Set

Coloring Vertices and Edges of a Path by Nonempty Subsets of a Set Coloring Vertices and Edges of a Path by Nonempty Subsets of a Set P.N. Balister E. Győri R.H. Schelp April 28, 28 Abstract A graph G is strongly set colorable if V (G) E(G) can be assigned distinct nonempty

More information

Inferences on a Normal Covariance Matrix and Generalized Variance with Monotone Missing Data

Inferences on a Normal Covariance Matrix and Generalized Variance with Monotone Missing Data Journal of Multivariate Analysis 78, 6282 (2001) doi:10.1006jmva.2000.1939, available online at http:www.idealibrary.com on Inferences on a Normal Covariance Matrix and Generalized Variance with Monotone

More information

Directed Graphical Models

Directed Graphical Models CS 2750: Machine Learning Directed Graphical Models Prof. Adriana Kovashka University of Pittsburgh March 28, 2017 Graphical Models If no assumption of independence is made, must estimate an exponential

More information

Measurement exchangeability and normal one-factor models

Measurement exchangeability and normal one-factor models Biometrika (2004) 91 3 pp 738 742 2004 Biometrika Trust Printed in Great Britain Measurement exchangeability and normal one-factor models BY HENK KELDERMAN Department of Work and Organizational Psychology

More information

A well-quasi-order for tournaments

A well-quasi-order for tournaments A well-quasi-order for tournaments Maria Chudnovsky 1 Columbia University, New York, NY 10027 Paul Seymour 2 Princeton University, Princeton, NJ 08544 June 12, 2009; revised April 19, 2011 1 Supported

More information

Lecture 12: May 09, Decomposable Graphs (continues from last time)

Lecture 12: May 09, Decomposable Graphs (continues from last time) 596 Pat. Recog. II: Introduction to Graphical Models University of Washington Spring 00 Dept. of lectrical ngineering Lecture : May 09, 00 Lecturer: Jeff Bilmes Scribe: Hansang ho, Izhak Shafran(000).

More information

We simply compute: for v = x i e i, bilinearity of B implies that Q B (v) = B(v, v) is given by xi x j B(e i, e j ) =

We simply compute: for v = x i e i, bilinearity of B implies that Q B (v) = B(v, v) is given by xi x j B(e i, e j ) = Math 395. Quadratic spaces over R 1. Algebraic preliminaries Let V be a vector space over a field F. Recall that a quadratic form on V is a map Q : V F such that Q(cv) = c 2 Q(v) for all v V and c F, and

More information

Strongly Regular Decompositions of the Complete Graph

Strongly Regular Decompositions of the Complete Graph Journal of Algebraic Combinatorics, 17, 181 201, 2003 c 2003 Kluwer Academic Publishers. Manufactured in The Netherlands. Strongly Regular Decompositions of the Complete Graph EDWIN R. VAN DAM Edwin.vanDam@uvt.nl

More information

Quivers of Period 2. Mariya Sardarli Max Wimberley Heyi Zhu. November 26, 2014

Quivers of Period 2. Mariya Sardarli Max Wimberley Heyi Zhu. November 26, 2014 Quivers of Period 2 Mariya Sardarli Max Wimberley Heyi Zhu ovember 26, 2014 Abstract A quiver with vertices labeled from 1,..., n is said to have period 2 if the quiver obtained by mutating at 1 and then

More information

ARTICLE IN PRESS Theoretical Computer Science ( )

ARTICLE IN PRESS Theoretical Computer Science ( ) Theoretical Computer Science ( ) Contents lists available at ScienceDirect Theoretical Computer Science journal homepage: www.elsevier.com/locate/tcs Conditional matching preclusion for hypercube-like

More information

LECTURE 2 LINEAR REGRESSION MODEL AND OLS

LECTURE 2 LINEAR REGRESSION MODEL AND OLS SEPTEMBER 29, 2014 LECTURE 2 LINEAR REGRESSION MODEL AND OLS Definitions A common question in econometrics is to study the effect of one group of variables X i, usually called the regressors, on another

More information

On graphs having a unique minimum independent dominating set

On graphs having a unique minimum independent dominating set AUSTRALASIAN JOURNAL OF COMBINATORICS Volume 68(3) (2017), Pages 357 370 On graphs having a unique minimum independent dominating set Jason Hedetniemi Department of Mathematical Sciences Clemson University

More information

CONSTRUCTION OF NESTED (NEARLY) ORTHOGONAL DESIGNS FOR COMPUTER EXPERIMENTS

CONSTRUCTION OF NESTED (NEARLY) ORTHOGONAL DESIGNS FOR COMPUTER EXPERIMENTS Statistica Sinica 23 (2013), 451-466 doi:http://dx.doi.org/10.5705/ss.2011.092 CONSTRUCTION OF NESTED (NEARLY) ORTHOGONAL DESIGNS FOR COMPUTER EXPERIMENTS Jun Li and Peter Z. G. Qian Opera Solutions and

More information

Causality in Econometrics (3)

Causality in Econometrics (3) Graphical Causal Models References Causality in Econometrics (3) Alessio Moneta Max Planck Institute of Economics Jena moneta@econ.mpg.de 26 April 2011 GSBC Lecture Friedrich-Schiller-Universität Jena

More information

The matrix approach for abstract argumentation frameworks

The matrix approach for abstract argumentation frameworks The matrix approach for abstract argumentation frameworks Claudette CAYROL, Yuming XU IRIT Report RR- -2015-01- -FR February 2015 Abstract The matrices and the operation of dual interchange are introduced

More information

An Introduction to Bayesian Machine Learning

An Introduction to Bayesian Machine Learning 1 An Introduction to Bayesian Machine Learning José Miguel Hernández-Lobato Department of Engineering, Cambridge University April 8, 2013 2 What is Machine Learning? The design of computational systems

More information

The Minimum Rank, Inverse Inertia, and Inverse Eigenvalue Problems for Graphs. Mark C. Kempton

The Minimum Rank, Inverse Inertia, and Inverse Eigenvalue Problems for Graphs. Mark C. Kempton The Minimum Rank, Inverse Inertia, and Inverse Eigenvalue Problems for Graphs Mark C. Kempton A thesis submitted to the faculty of Brigham Young University in partial fulfillment of the requirements for

More information