Conditional Independence and Markov Properties


Conditional Independence and Markov Properties
Lecture 1, Saint Flour Summer School, July 5, 2006
Steffen L. Lauritzen, University of Oxford

Overview of lectures
1. Conditional independence and Markov properties
2. More on Markov properties
3. Graph decompositions and junction trees
4. Probability propagation and similar algorithms
5. Log-linear and Gaussian graphical models
6. Conjugate prior families for graphical models
7. Hyper Markov laws
8. Structure learning and Bayes factors
9. More on structure learning

Conditional independence
The notion of conditional independence is fundamental for graphical models. For three random variables X, Y and Z we denote this as X ⊥⊥ Y | Z, and graphically as the chain X — Z — Y. If the random variables have density w.r.t. a product measure µ, the conditional independence is reflected in the relation
f(x, y, z)f(z) = f(x, z)f(y, z),
where f is a generic symbol for the densities involved.

Graphical models
[Figure: undirected graph on vertices 1–7 with edges 1–2, 1–3, 2–4, 2–5, 3–5, 3–6, 4–7, 5–6, 5–7, 6–7.]
For several variables, complex systems of conditional independence can be described by undirected graphs. Then a set of variables A is conditionally independent of a set B, given the values of a set of variables C, if C separates A from B.

A directed graphical model
[Figure: directed model showing relations between risk factors, diseases, and symptoms.]

A pedigree
[Figure: graphical model for a pedigree from a study of Werner's syndrome.] Each node is itself a graphical model.

A highly complex pedigree
[Figure: family relationship of 1641 members of a Greenland Eskimo population; the individual node labels are omitted.]

Conditional independence
Random variables X and Y are conditionally independent given the random variable Z if L(X | Y, Z) = L(X | Z). We then write X ⊥⊥ Y | Z (or X ⊥⊥_P Y | Z). Intuitively: knowing Z renders Y irrelevant for predicting X. Factorisation of densities w.r.t. a product measure:
X ⊥⊥ Y | Z ⟺ f(x, y, z)f(z) = f(x, z)f(y, z) ⟺ ∃a, b : f(x, y, z) = a(x, z)b(y, z).
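This criterion is directly checkable for discrete variables. Below is a minimal sketch (our illustration, not from the lecture), assuming the joint distribution is stored as a 3-d array p[x, y, z]:

```python
import numpy as np

def cond_indep(p, tol=1e-12):
    """Test X ⊥⊥ Y | Z via f(x,y,z) f(z) = f(x,z) f(y,z).

    p is a 3-d array of joint probabilities indexed as p[x, y, z].
    """
    f_xz = p.sum(axis=1)        # marginal f(x, z)
    f_yz = p.sum(axis=0)        # marginal f(y, z)
    f_z = p.sum(axis=(0, 1))    # marginal f(z)
    lhs = p * f_z[None, None, :]                # f(x,y,z) f(z)
    rhs = f_xz[:, None, :] * f_yz[None, :, :]   # f(x,z) f(y,z)
    return np.allclose(lhs, rhs, atol=tol)

# Example: X and Y are noisy copies of Z, hence independent given Z.
p = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        for z in range(2):
            p[x, y, z] = 0.5 * (0.9 if x == z else 0.1) * (0.7 if y == z else 0.3)
print(cond_indep(p))  # True
```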

Fundamental properties
For random variables X, Y, Z and W it holds that
(C1) if X ⊥⊥ Y | Z then Y ⊥⊥ X | Z;
(C2) if X ⊥⊥ Y | Z and U = g(Y), then X ⊥⊥ U | Z;
(C3) if X ⊥⊥ Y | Z and U = g(Y), then X ⊥⊥ Y | (Z, U);
(C4) if X ⊥⊥ Y | Z and X ⊥⊥ W | (Y, Z), then X ⊥⊥ (Y, W) | Z.
If the density w.r.t. a product measure satisfies f(x, y, z) > 0, then also
(C5) if X ⊥⊥ Y | Z and X ⊥⊥ Z | Y, then X ⊥⊥ (Y, Z).

Additional note on (C5)
f(x, y, z) > 0 is not necessary for (C5). It is enough, e.g., that f(y, z) > 0 for all (y, z), or f(x, z) > 0 for all (x, z). In the discrete and finite case it is even enough that the bipartite graphs G⁺ₓ = (𝒴 ∪ 𝒵, E⁺ₓ), defined by
y ∼⁺ₓ z ⟺ f(x, y, z) > 0,
are all connected. Alternatively, it is sufficient if the same condition is satisfied with X replacing Y. Is there a simple necessary and sufficient condition?
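Under the reading above (one bipartite support graph for each value x), the discrete sufficient condition can be checked mechanically; the following sketch is our own illustration, and the data layout p[x, y, z] is an assumption:

```python
from collections import deque

def support_graphs_connected(p):
    """p[x, y, z]: joint probability table. For each x, build the
    bipartite graph on the occurring y- and z-values with an edge
    y ~ z whenever p[x, y, z] > 0, and test that it is connected."""
    nx, ny, nz = p.shape
    for x in range(nx):
        edges = [(y, z) for y in range(ny) for z in range(nz) if p[x, y, z] > 0]
        if not edges:
            continue
        nodes = {('y', y) for y, _ in edges} | {('z', z) for _, z in edges}
        adj = {n: set() for n in nodes}
        for y, z in edges:
            adj[('y', y)].add(('z', z))
            adj[('z', z)].add(('y', y))
        start = next(iter(nodes))
        seen, queue = {start}, deque([start])
        while queue:                      # breadth-first search
            for m in adj[queue.popleft()] - seen:
                seen.add(m)
                queue.append(m)
        if seen != nodes:
            return False
    return True
```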

Graphoid axioms
A ternary relation ⊥σ among subsets of a finite set V is a graphoid if for all disjoint subsets A, B, C and D of V:
(S1) if A ⊥σ B | C then B ⊥σ A | C;
(S2) if A ⊥σ B | C and D ⊆ B, then A ⊥σ D | C;
(S3) if A ⊥σ B | C and D ⊆ B, then A ⊥σ B | (C ∪ D);
(S4) if A ⊥σ B | C and A ⊥σ D | (B ∪ C), then A ⊥σ (B ∪ D) | C;
(S5) if A ⊥σ B | (C ∪ D) and A ⊥σ C | (B ∪ D), then A ⊥σ (B ∪ C) | D.
It is a semigraphoid if only (S1)–(S4) hold.

Irrelevance
Conditional independence can be seen as encoding irrelevance in a fundamental way. With the interpretation "knowing C, A is irrelevant for learning B", (S1)–(S4) translate to:
(I1) if, knowing C, learning A is irrelevant for learning B, then B is irrelevant for learning A;
(I2) if, knowing C, learning A is irrelevant for learning B, then A is irrelevant for learning any part D of B;
(I3) if, knowing C, learning A is irrelevant for learning B, it remains irrelevant having learnt any part D of B;
(I4) if, knowing C, learning A is irrelevant for learning B and, having also learnt A, D remains irrelevant for learning B, then both A and D are irrelevant for learning B.
The property (S5) is slightly more subtle and not generally obvious. Also the symmetry (C1) is a special property of probabilistic conditional independence, rather than of general irrelevance, where (I1) could appear dubious.

Probabilistic semigraphoids
Let V be a finite set and X = (X_v, v ∈ V) random variables. For A ⊆ V, let X_A = (X_v, v ∈ A). Let 𝒳_v denote the state space of X_v; similarly x_A = (x_v, v ∈ A) ∈ 𝒳_A = ×_{v∈A} 𝒳_v. Abbreviate:
A ⊥⊥ B | S ⟺ X_A ⊥⊥ X_B | X_S.
Then the basic properties of conditional independence imply: the relation ⊥⊥ on subsets of V is a semigraphoid, and if f(x) > 0 for all x, ⊥⊥ is also a graphoid. Not all (semi)graphoids are probabilistically representable.

Second order conditional independence
Sets of random variables A and B are partially uncorrelated for fixed C if their residuals after linear regression on X_C are uncorrelated:
Cov{X_A − E(X_A | X_C), X_B − E(X_B | X_C)} = 0,
in other words, if the partial correlations ρ_{AB·C} are zero. We then write A ⊥₂ B | C. The relation ⊥₂ also satisfies the semigraphoid axioms (S1)–(S4), and the graphoid axioms if there is no non-trivial linear relation between the variables in V.
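The partial correlation is computable from any covariance matrix via the standard identity ρ_{ab·C} = −k_ab/√(k_aa k_bb), where K is the inverse of the covariance of (X_a, X_b, X_C). A minimal sketch (ours; the chain-shaped covariance in the example is an assumption made for illustration):

```python
import numpy as np

def partial_corr(Sigma, a, b, C):
    """Partial correlation of X_a and X_b given X_C (C a list of indices),
    via the concentration matrix of the marginal covariance of (a, b, C)."""
    idx = [a, b] + list(C)
    K = np.linalg.inv(Sigma[np.ix_(idx, idx)])
    return -K[0, 1] / np.sqrt(K[0, 0] * K[1, 1])

# Covariance of a chain X1 - X2 - X3: corr(1,3) = corr(1,2) * corr(2,3),
# so the partial correlation of X1 and X3 given X2 vanishes.
Sigma = np.array([[1.0, 0.5, 0.25],
                  [0.5, 1.0, 0.5],
                  [0.25, 0.5, 1.0]])
print(partial_corr(Sigma, 0, 2, [1]))  # ~ 0.0
```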

Separation in undirected graphs
Let G = (V, E) be a finite and simple undirected graph (no self-loops, no multiple edges). For subsets A, B, S of V, let A ⊥G B | S denote that S separates A from B in G, i.e. that all paths from A to B intersect S.
Fact: the relation ⊥G on subsets of V is a graphoid.
This fact is the reason for choosing the name "graphoid" for such separation relations.
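Separation is also easy to check algorithmically. The sketch below (our illustration) tests A ⊥G B | S by a breadth-first search that is forbidden to pass through S; it is reused for the Markov property examples that follow.

```python
from collections import deque

def separates(adj, A, B, S):
    """True iff S separates A from B in the undirected graph given by
    adj: a dict mapping each vertex to the set of its neighbours.
    A, B, S are assumed pairwise disjoint vertex sets."""
    A, B, S = set(A), set(B), set(S)
    seen, queue = set(A), deque(A)
    while queue:
        v = queue.popleft()
        if v in B:
            return False          # found a path from A to B avoiding S
        for w in adj[v]:
            if w not in S and w not in seen:
                seen.add(w)
                queue.append(w)
    return True
```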

Geometric orthogonality
As another fundamental example, consider geometric orthogonality in Euclidean vector spaces or Hilbert spaces. Let L, M and N be linear subspaces of a Hilbert space H and define
L ⊥ M | N ⟺ (L ⊖ N) ⊥ (M ⊖ N),
where L ⊖ N = L ∩ N^⊥. Then L and M are said to meet orthogonally in N. This has the properties:
(O1) if L ⊥ M | N then M ⊥ L | N;
(O2) if L ⊥ M | N and U is a linear subspace of L, then U ⊥ M | N;
(O3) if L ⊥ M | N and U is a linear subspace of M, then L ⊥ M | (N + U);
(O4) if L ⊥ M | N and L ⊥ R | (M + N), then L ⊥ (M + R) | N.
The analogue of (C5) does not hold in general: if M = N we may have L ⊥ M | N and L ⊥ N | M, but if L and M are not orthogonal it is false that L ⊥ (M + N).

Variation independence
Let U ⊆ 𝒳 = ×_{v∈V} 𝒳_v and define for S ⊆ V the S-section U^{u*_S} of U as
U^{u*_S} = {u_{V∖S} : u_S = u*_S, u ∈ U}.
Define further the conditional independence relation ⊥U as
A ⊥U B | C ⟺ ∀u_C : U^{u_C} = {U^{u_C}}_A × {U^{u_C}}_B,
i.e. if and only if the C-sections all have the form of a product space. The relation ⊥U satisfies the semigraphoid axioms. In particular, A ⊥U B | C holds if U is the support of a probability measure satisfying the analogous conditional independence restriction.
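For a finite set U of configurations the definition can be checked directly. In the sketch below (ours), it is assumed that A ∪ B ∪ C covers all coordinates, so that each C-section lives on the (A, B)-coordinates:

```python
from collections import defaultdict

def variation_indep(U, A, B, C):
    """U: set of tuples; A, B, C: disjoint lists of coordinate positions
    jointly covering all coordinates. Tests whether every C-section of U
    is the product of its A- and B-projections."""
    sections = defaultdict(set)
    for u in U:
        key = tuple(u[i] for i in C)
        sections[key].add((tuple(u[i] for i in A), tuple(u[i] for i in B)))
    for pairs in sections.values():
        As = {a for a, _ in pairs}
        Bs = {b for _, b in pairs}
        if len(pairs) != len(As) * len(Bs):   # pairs ⊆ As × Bs always holds
            return False
    return True

# The full product space is variation independent; remove a point and it fails.
U = {(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)}
print(variation_indep(U, [0], [1], [2]))                 # True
print(variation_indep(U - {(1, 1, 1)}, [0], [1], [2]))   # False
```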

Markov properties for semigraphoids
Let G = (V, E) be a simple undirected graph and ⊥σ a (semi)graphoid relation, where bd(α) denotes the neighbours of α and cl(α) = bd(α) ∪ {α}. Say that ⊥σ satisfies
(P) the pairwise Markov property if α ≁ β ⟹ α ⊥σ β | V ∖ {α, β};
(L) the local Markov property if ∀α ∈ V : α ⊥σ V ∖ cl(α) | bd(α);
(G) the global Markov property if A ⊥G B | S ⟹ A ⊥σ B | S.

Pairwise Markov property
[Figure: the example graph on vertices 1–7.]
Any non-adjacent pair of random variables is conditionally independent given the remaining variables. For example, 1 ⊥⊥ 5 | {2, 3, 4, 6, 7} and 4 ⊥⊥ 6 | {1, 2, 3, 5, 7}.

Local Markov property
[Figure: the example graph on vertices 1–7.]
Every variable is conditionally independent of the remaining variables, given its neighbours. For example, 5 ⊥⊥ {1, 4} | {2, 3, 6, 7} and 7 ⊥⊥ {1, 2, 3} | {4, 5, 6}.

Global Markov property
[Figure: the example graph on vertices 1–7.]
To find conditional independence relations, one should look for separating sets, such as {2, 3}, {4, 5, 6}, or {2, 5, 6}. For example, it follows that 1 ⊥⊥ 7 | {2, 5, 6} and 2 ⊥⊥ 6 | {3, 4, 5}.
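These separations can be verified with the separates() sketch given earlier; the adjacency list is read off the cliques listed in the factorization example below:

```python
adj = {1: {2, 3}, 2: {1, 4, 5}, 3: {1, 5, 6}, 4: {2, 7},
       5: {2, 3, 6, 7}, 6: {3, 5, 7}, 7: {4, 5, 6}}
print(separates(adj, {1}, {7}, {2, 5, 6}))  # True: 1 ⊥⊥ 7 | {2, 5, 6}
print(separates(adj, {2}, {6}, {3, 4, 5}))  # True: 2 ⊥⊥ 6 | {3, 4, 5}
print(separates(adj, {1}, {7}, {2, 3}))     # True: {2, 3} also separates
```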

Structural relations among Markov properties
For any semigraphoid it holds that
(G) ⟹ (L) ⟹ (P).
If ⊥σ satisfies the graphoid axioms, it further holds that
(P) ⟹ (G),
so that in the graphoid case
(G) ⟺ (L) ⟺ (P).
The latter holds in particular for ⊥⊥ when f(x) > 0.

(G) ⟹ (L) ⟹ (P)
(G) implies (L) because bd(α) separates α from V ∖ cl(α).
Assume (L), and let α ≁ β. Then β ∈ V ∖ cl(α), and
bd(α) ∪ ((V ∖ cl(α)) ∖ {β}) = V ∖ {α, β}.
Hence by (L) and (S3) we get α ⊥σ (V ∖ cl(α)) | V ∖ {α, β}. (S2) then gives α ⊥σ β | V ∖ {α, β}, which is (P).

(P) ⟹ (G) for graphoids
Assume (P) and A ⊥G B | S. We must show A ⊥σ B | S. W.l.o.g. assume A and B non-empty. The proof is reverse induction on n = |S|. If n = |V| − 2 then A and B are singletons, and (P) yields A ⊥σ B | S directly. So assume |S| = n < |V| − 2 and the conclusion established for |S| > n.
First assume V = A ∪ B ∪ S. Then either A or B has at least two elements, say A. If α ∈ A then B ⊥G (A ∖ {α}) | (S ∪ {α}) and also α ⊥G B | (S ∪ A ∖ {α}) (as ⊥G is a semigraphoid). Thus by the induction hypothesis (A ∖ {α}) ⊥σ B | (S ∪ {α}) and {α} ⊥σ B | (S ∪ A ∖ {α}). Now (S5) gives A ⊥σ B | S.
For A ∪ B ∪ S ⊂ V we choose α ∈ V ∖ (A ∪ B ∪ S). Then A ⊥G B | (S ∪ {α}), and hence the induction hypothesis yields A ⊥σ B | (S ∪ {α}). Further, either A ∪ S separates B from {α} or B ∪ S separates A from {α}. Assuming the former gives (again by the induction hypothesis) α ⊥σ B | A ∪ S. Using (S5) we get (A ∪ {α}) ⊥σ B | S, and from (S2) we derive A ⊥σ B | S. The latter case is similar.

Factorisation and Markov properties
For a ⊆ V, let ψ_a(x) denote a function depending on x_a only, i.e.
x_a = y_a ⟹ ψ_a(x) = ψ_a(y).
We can then write ψ_a(x) = ψ_a(x_a) without ambiguity. The distribution of X factorizes w.r.t. G, or satisfies (F), if its density f w.r.t. a product measure on 𝒳 has the form
f(x) = ∏_{a∈𝒜} ψ_a(x),
where 𝒜 is a collection of complete subsets of G, or, equivalently, if
f(x) = ∏_{c∈𝒞} ψ_c(x),
where 𝒞 is the set of cliques of G.

Factorization example
[Figure: the example graph on vertices 1–7.]
The cliques of this graph are the maximal complete subsets {1, 2}, {1, 3}, {2, 4}, {2, 5}, {3, 5, 6}, {4, 7}, and {5, 6, 7}; a complete set is any subset of one of these. The graph above corresponds to the factorization
f(x) = ψ_12(x_1, x_2) ψ_13(x_1, x_3) ψ_24(x_2, x_4) ψ_25(x_2, x_5) ψ_356(x_3, x_5, x_6) ψ_47(x_4, x_7) ψ_567(x_5, x_6, x_7).
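As a concrete illustration of (F) (our sketch, not from the lecture; the random potentials are arbitrary), the following assembles an unnormalised density over seven binary variables from positive clique potentials; any density built this way factorizes w.r.t. the graph and hence, by the factorization theorem below, satisfies the global Markov property:

```python
import itertools
import numpy as np

cliques = [(1, 2), (1, 3), (2, 4), (2, 5), (3, 5, 6), (4, 7), (5, 6, 7)]
rng = np.random.default_rng(0)
# One positive potential table per clique, indexed by the clique's variables.
psi = {c: rng.uniform(0.5, 2.0, size=(2,) * len(c)) for c in cliques}

def unnormalised(x):
    """x: dict vertex -> {0, 1}; product of the clique potentials."""
    val = 1.0
    for c in cliques:
        val *= psi[c][tuple(x[v] for v in c)]
    return val

# Normalising constant by brute force over all 2^7 configurations.
Z = sum(unnormalised(dict(zip(range(1, 8), bits)))
        for bits in itertools.product((0, 1), repeat=7))
# f(x) = unnormalised(x) / Z factorizes w.r.t. the graph above.
```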

Factorisation of the multivariate Gaussian
Consider a multivariate Gaussian random vector X ∼ N_V(ξ, Σ) with Σ regular, so it has density
f(x | ξ, Σ) = (2π)^{−|V|/2} (det K)^{1/2} exp{−(x − ξ)ᵀ K (x − ξ)/2},
where K = Σ⁻¹ is the concentration matrix of the distribution. Thus the Gaussian density factorizes w.r.t. G if and only if
α ≁ β ⟹ k_αβ = 0,
i.e. if and only if the concentration matrix has zero entries for non-adjacent vertices.
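A small sketch (ours; the edge weight 0.2 is an arbitrary choice) building a positive definite concentration matrix with exactly this zero pattern for the example graph; note that zero entries of K do not give zero entries of Σ = K⁻¹:

```python
import numpy as np

edges = [(1, 2), (1, 3), (2, 4), (2, 5), (3, 5), (3, 6),
         (4, 7), (5, 6), (5, 7), (6, 7)]
K = np.eye(7)
for a, b in edges:
    K[a - 1, b - 1] = K[b - 1, a - 1] = 0.2
# Make K strictly diagonally dominant, hence positive definite.
K[np.diag_indices(7)] = np.abs(K).sum(axis=1)
Sigma = np.linalg.inv(K)
print(K[0, 4])      # 0.0: vertices 1 and 5 are non-adjacent
print(Sigma[0, 4])  # generally non-zero: marginal dependence remains
```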

Factorization theorem
Consider a distribution with density f w.r.t. a product measure, and let (G), (L) and (P) denote the Markov properties w.r.t. the semigraphoid relation ⊥⊥. It then holds that
(F) ⟹ (G),
and further, if f(x) > 0 for all x,
(P) ⟹ (F).
Thus in the case of positive density (but typically only then), all the properties coincide:
(F) ⟺ (G) ⟺ (L) ⟺ (P).