Equivalence of the random intersection graph and G(n, p)

arXiv:0910.5311v1 [math.CO] 28 Oct 2009

Katarzyna Rybarczyk
Faculty of Mathematics and Computer Science, Adam Mickiewicz University, 60-769 Poznań, Poland

Abstract. We solve the conjecture of Fill, Scheinerman and Singer-Cohen posed in [4] and show the equivalence of the sharp threshold functions of the random intersection graph G(n, m, p) with m ≥ n³ and a graph G(n, p̂) with independent edges. Moreover, we prove sharper equivalence results under some additional assumptions.

Keywords: random intersection graph, equivalence, graph properties

1 Introduction

In an intersection graph there is a set of vertices V and an auxiliary set of objects W. Each vertex v ∈ V is assigned a subset of objects W(v) ⊆ W according to a given probability measure. Two vertices v₁, v₂ are adjacent in a random intersection graph if and only if W(v₁) ∩ W(v₂) ≠ ∅. A general model of the random intersection graph, in which each vertex is assigned a subset of objects W(v) ⊆ W chosen uniformly from all d-element subsets, where the cardinality d is determined according to an arbitrarily given probability distribution, was introduced in [5]. We will concentrate on analyzing the properties of the random intersection graph in which the cardinality d is chosen according to the binomial distribution. Namely, we will investigate properties of the random intersection graph G(n, m, p) introduced in [7, 10]. G(n, m, p) is a graph with |V| = n vertices and |W| = m objects, in which each object w is added to W(v) with probability p, independently for all v ∈ V and w ∈ W, i.e. Pr{w ∈ W(v)} = p. However, to some extent, the results obtained may be generalized to other random intersection graph models, due to the equivalence theorems proved in Section 4 of [2]. While introducing a new random graph model it is worth asking how it differs from those already studied.
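The definition above is straightforward to simulate. The following Python sketch (illustrative only and not part of the paper; the function name and parameters are ours) builds G(n, m, p) exactly as defined: object sets first, edges from non-empty intersections.

```python
import random

def sample_intersection_graph(n, m, p, seed=0):
    """Sample G(n, m, p): each object w is put into W(v) with probability p,
    independently for every vertex v and object w; vertices u, v are adjacent
    iff W(u) and W(v) intersect."""
    rng = random.Random(seed)
    W = [{w for w in range(m) if rng.random() < p} for _ in range(n)]
    edges = {(u, v) for u in range(n) for v in range(u + 1, n) if W[u] & W[v]}
    return W, edges
```

For p = 1 every W(v) equals W, so the graph is complete; for p = 0 it is empty.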
Indeed, one of the first papers on the topic of random intersection graphs was that of Fill, Scheinerman and Singer-Cohen [4]. The aim of the paper was to compare G(n, m, p) with the random graph G(n, p̂) in which each edge appears independently. The value of p̂ was set to be approximately Pr{{v₁, v₂} ∈ E(G(n, m, p))}. One of the conclusions of the article was that the differences between G(n, p̂) and G(n, m, p) are caused

by the dependency of edge appearances in the latter, and that the larger m is, the less relevant this dependency becomes. The main theorem in [4] states that for α > 6 the graphs G(n, p̂) and G(n, m, p) have asymptotically the same properties. Moreover, it is pointed out that the theorem may be extended to smaller values of α if we make additional assumptions about p. The proof is based on the fact that for large α and the relevant values of p, with probability tending to one as n → ∞, there are no objects assigned to more than two vertices, and therefore the dependency between edges is asymptotically negligible. The authors of [4] suggest that the equivalence theorem is true for all properties for 3 ≤ α ≤ 6, i.e. in the case where the number of vertices assigned to each object is still small.

The above-mentioned result and conjecture are consistent with the simple observation that the number of vertices to which a given object w is assigned has an essential impact on the dependency between edge appearances in G(n, m, p). The edge set of the random intersection graph G(n, m, p) is a union of cliques with vertex sets V_w := {v ∈ V : w ∈ W(v)}, w ∈ W, and we may divide the edge set of G(n, m, p) according to the size of the clique in which an edge is contained. Let k ≥ 2. We will denote by G_k(n, m, p) the graph with vertex set V and edge set {{v₁, v₂} : ∃w v₁, v₂ ∈ V_w and |V_w| = k}. Alternatively, we may define G_k(n, m, p) = G(H_k(n, m, p)), where H_k(n, m, p) is the hypergraph with vertex set V and edge set {{v₁, ..., v_k} : ∃w V_w = {v₁, ..., v_k}}, and, for a hypergraph H, G(H) is the graph with the same vertex set as H and with edge set consisting of those pairs of vertices which are contained in at least one edge of H. Under this notation

E(G(n, m, p)) = ⋃_{k=2}^{m} E(G(H_k(n, m, p))) = ⋃_{k=2}^{m} E(G_k(n, m, p)).

In [4] it is shown that for some m and p the graphs G(n, m, p), G₂(n, m, p) and G(n, p̂) are asymptotically almost the same.
To be precise, G_k(n, m, p) is empty for k ≥ 3 with probability tending to one as n → ∞ (we will say: with high probability) and the edges in G₂(n, m, p) are almost independent. The authors of [4] support the conjecture for 3 ≤ α ≤ 6 by results concerning threshold functions for some properties of G(n, m, p). However, it should be pointed out that if there exists C > 0 such that p ≥ C(1/(n³m))^{1/3}, then the expected number of edges in G₃(n, m, p) tends to a constant or even to infinity. Therefore we may expect that the structures of G(n, m, p) and G(n, p̂) differ. Namely, though the triangles in G₂(n, m, p) may make the dominating contribution, the impact of the triangles contained in G₃(n, m, p) on the structure of the random intersection graph cannot be omitted. As an example we may state the fact that for α = 3 the number of triangles in G(n, m, p) and in G(n, p̂) on the threshold of their appearance, i.e. for p = c/n² and p̂ ≈ mp² = c²/n, has Poisson distribution with parameter (c³ + c⁶)/3! and c⁶/3!, respectively (see [11]). For larger values of α the expected numbers of triangles in G(n, m, p) and G(n, p̂) may also differ significantly. The same is true for the cliques of size four contained in G₄(n, m, p). In fact, G_k(n, m, p) should rather be compared with G(H_k(n, p̂_k)), where p̂_k is approximately the probability that for a given {v₁, ..., v_k} ⊆ V there exists w such that V_w = {v₁, ..., v_k}, H_k(n, p̂_k) is a k-uniform random hypergraph with each edge appearing independently with probability p̂_k, and G(H_k(n, p̂_k)) is defined as above. The above observation leads us to the conclusion that the equivalence theorem may not be stated for 3 ≤ α ≤ 6 in such a general

form as it was for α > 6. Therefore we will restrict our attention to the case of monotone properties. The concept of restricting equivalence theorems to the class of monotone properties has already been developed while examining the equivalence of G(n, p̂) and G(n, M) (see [3, 6, 9]).

The article is organized as follows. In Section 2 we state and discuss the results. In Section 4 we outline the proof of the main theorems. In Sections 5 and 6 we prove the results needed to complete the proof. Throughout the article all limits are taken as n → ∞. We will also use standard Landau notation O, Θ, Ω, o (see for example [6]), and we will use the phrase "with high probability" to mean "with probability tending to one as n → ∞".

2 Results

In our considerations we will restrict our attention to G(n, m, p) for

Ω((1/(n³m))^{1/2}) = p = O((ln n/m)^{1/2}).

For values of p significantly larger than (ln n/m)^{1/2} the graph G(n, m, p) is with high probability a complete graph on n vertices (see [4, 10]), and if p = o((1/(n³m))^{1/3}), then with high probability G_k(n, m, p) is empty for all k ≥ 3. Therefore a slight modification of the proof from [4] gives us the result that G(n, m, p) and G(n, p̂) are asymptotically equivalent for all possible properties. In fact we may state the following equivalence theorem.

Theorem 1. Let a ∈ [0, 1], A be any graph property, p = o((1/(n³m))^{1/3}) and

p̂ = 1 − exp(−mp²(1 − p)^{n−2}).

Then

Pr{G(n, p̂) ∈ A} → a if and only if Pr{G(n, m, p) ∈ A} → a.

For the proof see [4] and Lemma 6.

The remaining results will concern monotone properties. The most important properties, such as connectivity, having the largest component of size at least k, containing a perfect matching or containing a given graph as a subgraph, are included in the wide family of monotone properties. Let 𝒢 be a family of graphs with vertex set V. We will call A ⊆ 𝒢 an increasing (decreasing) property if A is closed under isomorphism and G ∈ A implies G′ ∈ A for all G′ such that E(G) ⊆ E(G′) (respectively E(G′) ⊆ E(G)).
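Numerically, the p̂ of Theorem 1 is very close to the exact probability that some object is assigned to precisely the two vertices of a fixed pair, which is 1 − (1 − p²(1 − p)^{n−2})^m; the exponential form only replaces 1 − x by e^{−x} per object. A small check (illustrative only; the parameter values below are ours):

```python
import math

def p_hat(n, m, p):
    """p-hat = 1 - exp(-m p^2 (1-p)^(n-2)), as in Theorem 1."""
    return 1 - math.exp(-m * p**2 * (1 - p) ** (n - 2))

def exact_two_clique_prob(n, m, p):
    """Exact probability that some object w has V_w = {v1, v2} for a fixed
    pair: each of the m objects does so independently with probability
    p^2 (1-p)^(n-2)."""
    return 1 - (1 - p**2 * (1 - p) ** (n - 2)) ** m
```

For n = 100, m = n³ and p well below the triangle threshold n⁻¹m^{−1/3}, the two quantities agree to many digits.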

Theorem 2. Let a ∈ [0, 1], m = n^α for α ≥ 3 and A be any monotone property.

(i) Let Ω((1/(n³m))^{1/2}) = p = O((ln n/m)^{1/2}), and (1/(n³m))^{1/3} = o(p) for α = 3. If

Pr{G(n, 1 − exp(−mp²(1 − p)^{n−2})) ∈ A} → a

and for all ε = ε(n) → 0

Pr{G(n, (1 + ε)(1 − exp(−mp²(1 − p)^{n−2}))) ∈ A} → a,

then Pr{G(n, m, p) ∈ A} → a.

(ii) Let p̂ = p̂(n) ∈ [0, 1) be a sequence bounded away from one by a constant. If for all ε = ε(n) → 0

Pr{G(n, m, (−ln(1 − p̂)/((1 + ε)m))^{1/2}) ∈ A} → a

and

Pr{G(n, m, (−ln(1 − p̂)/((1 − ε)m))^{1/2}) ∈ A} → a,

then Pr{G(n, p̂) ∈ A} → a.

In (i) we have to exclude the case p = Θ(1/(n·m^{1/3})), since the thesis is not true on the threshold of triangle appearance (see [11]). The method of the proof is strong enough to show sharper results in many cases. For example, for α > 3 the function ε(n) may be replaced by 1/n^δ, where δ is a constant depending on α. We state here two theorems as examples of how tight the results may be if we make some additional assumptions.

Theorem 3. Let a ∈ [0, 1], A be any monotone property, m = n^α for α > 4 and

Ω((1/(n³m))^{1/2}) = p = O((ln n/m)^{1/2}).

Let

p₋ = 1 − exp(−mp²(1 − p)^{n−2});
p₊ = 1 − exp(−mp²(1 − p)^{n−2}) + 10(mp³)^{1/3}.

If

Pr{G(n, p₋) ∈ A} → a and Pr{G(n, p₊) ∈ A} → a,

then Pr{G(n, m, p) ∈ A} → a.

Theorem 4. Let a ∈ [0, 1], c₃ = 30, c₄ = 157, A be any monotone property and m = n^α for α > 10/3. Let

p₋ = 1 − exp(−mp²(1 − p)^{n−2});

p₊ = 1 − exp(−mp²(1 − p)^{n−2}) + c₃(mp³)^{1/3}, for Ω(n⁻¹m^{−1/3}) = p = o(n⁻¹m^{−1/4});

p₊ = 1 − exp(−mp²(1 − p)^{n−2}) + c₃(mp³)^{1/3} + c₄(mp⁴)^{1/6}, for Ω(n⁻¹m^{−1/4}) = p = O((ln n/m)^{1/2}).

If

Pr{G(n, p₋) ∈ A} → a and Pr{G(n, p₊) ∈ A} → a,

then Pr{G(n, m, p) ∈ A} → a.

3 Auxiliary definitions, inequalities and facts

3.1 Coupling

We will frequently use a coupling argument. Let (P, ≼) be a countable partially ordered set. In our case P may be a subset of ℕ with the relation ≤, a Cartesian product ℕ^t with the relation (x₁, ..., x_t) ≼ (y₁, ..., y_t) ⇔ x_i ≤ y_i for all 1 ≤ i ≤ t, or a set 𝒢 of hypergraphs on a given set of vertices with the relation of being a subhypergraph. In the article the set 𝒢 will be either the set of all graphs or hypergraphs on n vertices, or the set of k-partite graphs or hypergraphs with partition classes of n vertices. To omit unnecessary formalities we will not say explicitly which partially ordered set we are considering when it is obvious from the context.

Let X and Y be two random variables with values in P. We will write X ≼_q Y if there exists a coupling (X′, Y′) of the random variables such that X′ ≼ Y′ with probability at least q, i.e. if there exists a probability space Ω and two random variables X′ and Y′, both defined on Ω, having the same probability distributions as X and Y, respectively, and such that X′ ≼ Y′ with probability at least q. We will use the fact that such a coupling exists if and only if there exists a probability measure μ : P × P → [0, 1] such that for any set A ⊆ P we have μ(A × P) = Pr{X ∈ A} and μ(P × A) = Pr{Y ∈ A}, and μ({(x, y) ∈ P × P : x ≼ y}) ≥ q.

Now we will state two useful facts. The simple proofs we add for completeness of considerations.

Fact 1. Let P be a countable partially ordered set and X, Y and Z be random variables with values in P. If

(4) X ≼_{1−q₁} Y and Y ≼_{1−q₂} Z,

then X ≼_{1−q} Z for some q ≤ q₁ + q₂.

Proof. Let μ₁, μ₂ : P × P → [0, 1] be the probability measures associated with the couplings existing by (4). Let P′ = {y ∈ P : Pr{Y = y} ≠ 0}. Define

μ₃ : P × P′ × P → [0, 1], μ₃(x, y, z) = μ₁(x, y)μ₂(y, z)/Pr{Y = y};
μ : P × P → [0, 1], μ(x, z) = μ₃({x} × P′ × {z}).

Then for A₁ = {(x, y, z) : x ≼ y} and A₂ = {(x, y, z) : y ≼ z} we have

μ({(x, z) : x ≼ z}) = μ₃({(x, y, z) : x ≼ z}) ≥ μ₃(A₁ ∩ A₂) ≥ μ₃(A₁) + μ₃(A₂) − 1 ≥ 1 − (q₁ + q₂). □

Fact 2. If (X₁, ..., X_t) and (Y₁, ..., Y_t) are vectors of independent random variables and

(5) X_i ≼_{q_i} Y_i for all 1 ≤ i ≤ t,

then

(X₁, ..., X_t) ≼_q (Y₁, ..., Y_t) and Σ_{i=1}^{t} X_i ≼_{q′} Σ_{i=1}^{t} Y_i,

where q, q′ ≥ Π_{i=1}^{t} q_i.

Proof. For all 1 ≤ i ≤ t, let μ_i : P × P → [0, 1] be a probability measure associated with the coupling existing by X_i ≼_{q_i} Y_i. A simple calculation shows that μ : P^t × P^t → [0, 1] defined by

μ((x₁, ..., x_t), (y₁, ..., y_t)) = Π_{i=1}^{t} μ_i(x_i, y_i)

implies the thesis. □
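Facts 1 and 2 are the standard mechanics of monotone couplings. As a concrete instance of Fact 2 (our illustration, not from the paper): for p ≤ q one has Bernoulli(p) ≼₁ Bernoulli(q) via one shared uniform per trial, hence Bin(n, p) ≼₁ Bin(n, q).

```python
import random

def coupled_binomials(n, p, q, seed=0):
    """Couple X ~ Bin(n, p) and Y ~ Bin(n, q), with p <= q, on one
    probability space so that X <= Y surely: for each trial draw a single
    uniform U and use the indicators [U < p] and [U < q] (Fact 2 with all
    q_i = 1)."""
    rng = random.Random(seed)
    us = [rng.random() for _ in range(n)]
    x = sum(u < p for u in us)
    y = sum(u < q for u in us)
    return x, y
```

The marginals are exactly Bin(n, p) and Bin(n, q), and the domination X ≤ Y holds on every sample point.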

3.2 Total variation distance

Let X and Y be random variables with values in the same countable set P. We define the total variation distance between X and Y by

d_TV(X, Y) = max_{A ⊆ P} |Pr{X ∈ A} − Pr{Y ∈ A}| = (1/2) Σ_{x ∈ P} |Pr{X = x} − Pr{Y = x}|.

Now let P = 𝒢 be the set of hypergraphs (graphs) with a given vertex set. Let G₁ and G₂ be two random variables with values in 𝒢. Since

d_TV(G₁, G₂) = (1/2) Σ_{G ∈ 𝒢} |Pr{G₁ = G} − Pr{G₂ = G}|,

it is simple to construct a probability measure μ on 𝒢 × 𝒢, with marginal distributions those of G₁ and G₂, such that μ({(G, G′) : G ⊆ G′}) ≥ 1 − 2d_TV(G₁, G₂). This implies:

Fact 3. G₁ ≼_q G₂ and G₂ ≼_{q′} G₁, where q, q′ ≥ 1 − 2d_TV(G₁, G₂).

In our considerations we will also use the following facts concerning the total variation distance, which are Facts 3 and 4 in [4].

Fact 4. Let A and A′ be random variables with values in the same set. If there exist random variables B and B′ such that for all possible b the distribution of A conditioned on B = b and the distribution of A′ conditioned on B′ = b are the same, then

d_TV(A, A′) ≤ 2d_TV(B, B′).

Fact 5. Let A and A′ be two random variables. If there exists a probability space on which random variables B and B′ are both defined and have the probability distributions of A and A′, respectively, then

d_TV(A, A′) ≤ Pr{B ≠ B′}.

We will also use a standard result (see for example [1], equation (1.23)).

Fact 6. Let A be a random variable with binomial distribution Bin(n̂, p̂) and let A′ be a random variable with Poisson distribution Po(n̂p̂). Then

d_TV(A, A′) ≤ p̂.

3.3 Coupon collector model

We will define two auxiliary random variables, which are generalized versions of random variables defined in [4]. Let K ≥ 2 be a given constant integer, M be any random variable with values in ℕ, n = (n₂, ..., n_K) be a vector of positive integers and P = (P₂, ..., P_K) be a vector of nonnegative reals such that Σ_{k=2}^{K} n_k P_k ≤ 1. Assume now that we have

Σ_{k=2}^{K} n_k coupons ⋃_{k=2}^{K} {c^k_1, ..., c^k_{n_k}} and one blank coupon d₀. We make M independent draws, with replacement, such that in each draw

Pr{c^k_i is chosen} = P_k, for 2 ≤ k ≤ K, 1 ≤ i ≤ n_k;
Pr{d₀ is chosen} = 1 − Σ_{k=2}^{K} n_k P_k.

In this scheme we define R^k_i(M) to be the random variable denoting the number of times that the coupon c^k_i was chosen, and

X^k_i(M) = 1 if R^k_i(M) ≥ 1, and 0 otherwise.

The first auxiliary random variable is

(6) X(M) = X(n, P, M) = (X²(M), ..., X^K(M)), where X^k(M) = Σ_{i=1}^{n_k} X^k_i(M).

The second random variable is

(7) Y = Y(n, P′) = (Y₂, ..., Y_K),

where P′ = (P′₂, ..., P′_K) is a vector such that P′_k ≤ 1 for all 2 ≤ k ≤ K and Y_k, 2 ≤ k ≤ K, are independent random variables with binomial distribution Bin(n_k, P′_k).

The simple observation stated below is a generalization of a part of the proof of Claim 1 in [4] and may be shown by careful calculation.

Fact 7. Let M be a random variable with Poisson distribution Po(λ). Then R^k_i(M), 2 ≤ k ≤ K and 1 ≤ i ≤ n_k, are independent random variables with Poisson distribution Po(λP_k). Moreover X^k(M), 2 ≤ k ≤ K, are independent random variables with binomial distribution Bin(n_k, 1 − exp(−λP_k)). Therefore X(M) and Y have the same distribution for P′_k = 1 − exp(−λP_k).

It is also simple to show the following fact.

Fact 8. Let M and M′ be random variables with values in ℕ. If M ≼_{1−o(1)} M′, then X(M) ≼_{1−o(1)} X(M′).
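The Poissonization in Fact 7 rests on the identity Pr{R^k_i(M) = 0} = Σ_m Po(λ)(m)·(1 − P_k)^m = exp(−λP_k). This mixture can be evaluated numerically (the function and parameter values below are ours, for illustration only):

```python
import math

def miss_probability(lam, P, cutoff=400):
    """Pr{a coupon with per-draw probability P is never chosen}, when the
    number of draws M is Poisson(lam): the mixture
    sum_m e^{-lam} lam^m / m! * (1 - P)^m, summed term by term."""
    term = math.exp(-lam)          # m = 0 term: e^{-lam}
    total = term
    for m in range(1, cutoff):
        term *= lam * (1 - P) / m  # term_m = term_{m-1} * lam(1-P)/m
        total += term
    return total
```

The iterative recurrence avoids overflow in lam**m / m!; the truncated tail is negligible for cutoff well above lam.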

3.4 Chernoff bound

We will use the Chernoff bound (see Theorem 2.1 in [6]).

Lemma 1. Let X be a random variable with binomial distribution and λ = EX, and let a ≥ λ. Then

Pr{X ≥ a} ≤ exp(−λ − a ln(a/λ) + a).

After careful calculation we obtain the following lemma.

Lemma 2. Let t ≥ 1 be an integer and X_n be a sequence of random variables with binomial distribution such that EX_n = λ_n. Let ε > 0 and ω(n) be any function tending to infinity. If

(8) a_n = a_n(λ_n, t, ε) =
  (t + ε) ln n/(ln ln n − ln λ_n), for λ_n = o(ln n);
  ω(n)λ_n, for λ_n = Θ(ln n);
  (1 + ε)λ_n, for ln n = o(λ_n),

then Pr{X_n ≥ a_n} = o(n^{−t}).

Lemma 3. Let X_n be a sequence of random variables with binomial distribution. Then

(9) Pr{X_n ≤ EX_n − t_n} ≤ exp(−t_n²/(2EX_n)), for t_n ≥ 0;
    Pr{X_n ≥ EX_n + t_n} ≤ exp(−3t_n²/(2(3EX_n + t_n))), for t_n ≥ 0.

It is also possible to formulate a version of the Chernoff bound for random variables with Poisson distribution.

Lemma 4. Let X_n be a sequence of random variables with Poisson distribution Po(λ) and let i > 0 be any constant. Then

(10) Pr{X_n ≤ EX_n − t_n} ≤ exp(−t_n²/(2EX_n)) + o(n^{−i}), for t_n ≥ 0;
     Pr{X_n ≥ EX_n + t_n} ≤ exp(−3t_n²/(2(3EX_n + t_n))) + o(n^{−i}), for t_n ≥ 0.

Proof. It follows by (9) applied to a random variable with binomial distribution Bin(λn^{i+1}, 1/n^{i+1}), by the definition of the total variation distance and by Fact 6. □

4 Outline of the proof of the theorems

Let p₋ and p₊ be such that

(11) G(n, p₋) ≼_{1−o(1)} G(n, m, p) and G(n, m, p) ≼_{1−o(1)} G(n, p₊).

Thus we may assume that there exists a probability space on which G(n, p₋) and G(n, m, p) (respectively G(n, m, p) and G(n, p₊)) are defined and Pr{E₋} = 1 − o(1) (respectively Pr{E₊} = 1 − o(1)) for

E₋ := {G(n, p₋) ⊆ G(n, m, p)} (respectively E₊ := {G(n, m, p) ⊆ G(n, p₊)}).

Let a ∈ [0, 1] and A be an increasing property. If Pr{G(n, p₋) ∈ A} → a and Pr{G(n, p₊) ∈ A} → a, then under the couplings implied by (11)

Pr{G(n, m, p) ∈ A} ≤ Pr{G(n, m, p) ∈ A | E₊}Pr{E₊} + Pr{E₊^c}
 ≤ Pr{G(n, p₊) ∈ A | E₊}Pr{E₊} + Pr{E₊^c}
 = Pr{{G(n, p₊) ∈ A} ∩ E₊} + Pr{E₊^c}
 ≤ Pr{G(n, p₊) ∈ A} + Pr{E₊^c}
 = Pr{G(n, p₊) ∈ A} + o(1) = a + o(1)

and

Pr{G(n, m, p) ∈ A} ≥ Pr{G(n, m, p) ∈ A | E₋}Pr{E₋}
 ≥ Pr{G(n, p₋) ∈ A | E₋}Pr{E₋}
 = Pr{{G(n, p₋) ∈ A} ∩ E₋}
 ≥ Pr{G(n, p₋) ∈ A} + Pr{E₋} − 1
 = Pr{G(n, p₋) ∈ A} + o(1) = a + o(1).

Analogous inequalities may be formulated for decreasing properties, or if we replace (11) by

(12) G(n, m, p₋) ≼_{1−o(1)} G(n, p̂) ≼_{1−o(1)} G(n, m, p₊).

Therefore the main aim of the proof is to show that, under the assumptions of the theorems, the couplings (11) and (12) exist. The existence of the couplings is implied by the following lemma.

Lemma 5. Let

(13) a_n(q) =
  6, for nq² = O(n^{−1/2});
  3 ln n/(ln(1/(nq²)) + ln ln n), for nq² = o(1) and n^{−1/2} = o(nq²);
  3 ln n/ln ln n, for nq² = Θ(1);
  3 ln n/(ln ln n − 3 ln(nq²)), for nq² → ∞ and nq² = o((ln n)^{1/3});
  ω(n)n³q⁶, where ω(n) → ∞, for nq² → ∞ and nq² = Θ((ln n)^{1/3});
  cn³q⁶, where c > 1, for nq² → ∞ and (ln n)^{1/3} = o(nq²).

Moreover let c₃ > 2·3/(3!)^{1/3}, c₄ > 15^{1/3}·3·4/(4!)^{1/6}, c₅ > 2^{1/6}·3^{1/2}·3^{1/5}·4·5/(5!)^{1/10},

q_k = (1 − exp(−mp^k(1 − p)^{n−k}))^{2/(k(k−1))}, for k = 2, 3, 4, 5,

and let p₋ = q₂ and

p₊ = q₂ + a_n(c₃q₃)c₃q₃, for p = Ω(n⁻¹m^{−1/3}) and p = o(min{n⁻¹m^{−1/4}, n^{−3/7}m^{−1/3}});

p₊ = q₂ + Σ_{k=3}^{4} a_n(c_kq_k)c_kq_k, for p = Ω(n⁻¹m^{−1/4}) and p = o(min{n⁻¹m^{−1/5}, n^{−3/7}m^{−1/3}, n^{−9/14}m^{−1/4}});

p₊ = q₂ + Σ_{k=3}^{5} a_n(c_kq_k)c_kq_k, for p = Ω(n⁻¹m^{−1/5}) and p = o(min{n⁻¹m^{−1/6}, n^{−3/7}m^{−1/3}, n^{−9/14}m^{−1/4}, n^{−6/7}m^{−1/5}}).

Then

G(n, p₋) ≼_{1−o(1)} G(n, m, p) and G(n, m, p) ≼_{1−o(1)} G(n, p₊).

Now we will prove Theorems 2, 3 and 4 using Lemma 5.

Proof of Theorems 2, 3 and 4. It is easy to check that, under the assumptions of Theorem 2(i) on p, we have p₊ = (1 + o(1))q₂. Therefore for some ε′ = ε′(n) → 0 we have p₊ = (1 + ε′)(1 − exp(−mp²(1 − p)^{n−2})). Thus by Lemma 5

G(n, q₂) ≼_{1−o(1)} G(n, m, p) ≼_{1−o(1)} G(n, (1 + ε′)q₂),

which implies Theorem 2(i). Analogously, in the proof of Theorem 2(ii) we use the fact that either p̂ is such that we may use Theorem 1, or for some ε′ = ε′(n) → 0 we get

G(n, m, (−ln(1 − p̂)/((1 + ε′)m))^{1/2}) ≼_{1−o(1)} G(n, p̂)

and

G(n, p̂) ≼_{1−o(1)} G(n, m, (−ln(1 − p̂)/((1 − ε′)m))^{1/2}).

This implies Theorem 2(ii). The proofs of Theorems 3 and 4 are basically the same. We only have to calculate the value of a_n(c_kq_k)c_k and notice that q_k ≈ (mp^k)^{2/(k(k−1))} for k = 3, 4. Therefore we need to show Lemma 5 to complete the proof. □

Outline of the proof of Lemma 5. In the statement of the lemma we have three different values of p₊. They correspond to three cases. In the first case, since p = o(n⁻¹m^{−1/4}), the hypergraph H₄(n, m, p) is empty with high probability. In the second case H₅(n, m, p) is empty and in the third case H₆(n, m, p) is. We will prove Lemma 5 in all three cases at the same time.

The proof will differ only by the value of K, which is 3, 4 and 5 in the first, second and third case, respectively. Let m = n^α and q_k = (1 − exp(−mp^k(1 − p)^{n−k}))^{2/(k(k−1))}. We will prove that under the assumptions of Lemma 5 there exists a sequence of couplings

(14) G(n, q₂) ≼_{1−o(1)} G₂(n, m, p);
(15) G₂(n, m, p) ≼₁ G(n, m, p) ≼_{1−o(1)} ⋃_{k=2}^{K} G_k(n, m, p);
(16) ⋃_{k=2}^{K} G_k(n, m, p) ≼_{1−o(1)} ⋃_{k=2}^{K} G(H_k(n, q_k^{k(k−1)/2}));
(17) ⋃_{k=2}^{K} G(H_k(n, q_k^{k(k−1)/2})) ≼_{1−o(1)} G(n, q₂) ∪ ⋃_{k=3}^{K} G(n, a_n(c_kq_k)c_kq_k);
(18) G(n, q₂) ∪ ⋃_{k=3}^{K} G(n, a_n(c_kq_k)c_kq_k) ≼₁ G(n, q₂ + Σ_{k=3}^{K} a_n(c_kq_k)c_kq_k).

Here G(H₂(n, q₂)), ..., G(H_K(n, q_K^{K(K−1)/2})) are independent random hypergraphs, c₃, c₄, c₅ are the constants given in the statement of the lemma and a_n is the function given in the statement of the lemma. The left-hand side coupling of (15) is trivial and the coupling (18) is a standard one. The right-hand side coupling in (15) follows by the fact that under the assumptions of Lemma 5

Pr{∃ w ∈ W : |V_w| > K} = O(mn^{K+1}p^{K+1}) = o(1).

Proving the existence of the remaining couplings will require more effort. The couplings (14) and (16) follow by Lemma 6 and Fact 3.

Lemma 6. Let K ≥ 2 be a constant integer and p = o(1/n). Then

d_TV(⋃_{k=2}^{K} H_k(n, m, p), ⋃_{k=2}^{K} H_k(n, 1 − exp(−mp^k(1 − p)^{n−k}))) = o(1),

where the H_k(n, 1 − exp(−mp^k(1 − p)^{n−k})) are independent random hypergraphs.

Lemma 6 will be shown in Section 5. The following lemmas imply the existence of the coupling (17).

Lemma 7. Let c₃ > 2·3/(3!)^{1/3}, q = Ω(n⁻¹) and q = o(n^{−3/7}). Then

G(H₃(n, q³)) ≼_{1−o(1)} G(n, a_n(c₃q)c₃q),

where a_n(q) is defined as in (13).

Lemma 8. Let c₄ > 15^{1/3}·3·4/(4!)^{1/6}, q = Ω(n^{−2/3}) and q = o(n^{−3/7}). Then

G(H₄(n, q⁶)) ≼_{1−o(1)} G(n, a_n(c₄q)c₄q),

where a_n(q) is defined as in (13).

Lemma 9. Let c₅ > 2^{1/6}·3^{1/2}·3^{1/5}·4·5/(5!)^{1/10}, q = Ω(n^{−1/2}) and q = o(n^{−3/7}). Then

G(H₅(n, q¹⁰)) ≼_{1−o(1)} G(n, a_n(c₅q)c₅q),

where a_n(q) is defined as in (13).

The above lemmas will be shown in Section 6. The coupling (17) is a consequence of the above lemmas after substituting q = q_k for k = 3, ..., K. Notice that although the expected number of hyperedges in H_k(n, q_k^{k(k−1)/2}) and the expected number of k-cliques in G(n, q_k) are the same, the function a_n is necessary. We will construct a coupling of two random graph models whose existence contradicts the claim that there exists a constant C such that for all q

G(H₃(n, q³)) ≼_{1−o(1)} G(n, Cq).

Let q = o(1). For any e, a 3-element subset of V, define F_e to be the set of bijections assigning to the numbers from the set {1, 2, 3} the vertices of e (|F_e| = 6). Now, to each e, a 3-element subset of V, and each function f ∈ F_e, we assign f to e, independently of all other functions and sets, with probability

r = 1 − (1 − q³)^{1/6} ≈ q³/6.

Notice that if we add each e to the edge set of the hypergraph with vertex set V whenever at least one function from F_e is assigned to e, we get a random variable with the same distribution as H₃(n, q³). Moreover we may construct a random subgraph G₃ of G(H₃(n, q³)) by adding an edge {v₁, v₂}, v₁, v₂ ∈ V, if and only if at least one 3-element subset of V containing v₁ and v₂ is assigned a function in which v₁ and v₂ are assigned 1 and 2, or 2 and 1. Notice that, by the independent choice of the functions from F_e, each edge appears in G₃ independently with probability

r′ = 1 − (1 − r)^{2(n−2)} ≈ 2(n − 2)r ≈ nq³/3.

Therefore G(n, r′) ≼₁ G(H₃(n, q³)), and in the lemmas there must be a_n = Ω(nq²).
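The edge probability r′ of the subgraph G₃ in the construction above can be evaluated exactly; the computation below (the function and the parameter values are ours, for illustration) confirms r′ ≈ nq³/3, so that r′/q grows like nq²/3 and no single constant C can work.

```python
def g3_edge_probability(n, q):
    """Edge probability r' of the subgraph G_3 from the construction:
    r = 1 - (1 - q^3)^(1/6) per (triple, bijection) assignment, and a fixed
    pair {v1, v2} receives an edge from any of the 2(n - 2)
    (triple, orientation) choices containing it."""
    r = 1 - (1 - q**3) ** (1 / 6)
    return 1 - (1 - r) ** (2 * (n - 2))
```

For small q the first-order approximation r′ ≈ nq³/3 is tight, while r′/q = Θ(nq²) is unbounded as nq² grows.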

5 Proof of Lemma 6

Let, for all 2 ≤ k ≤ K,

p_k = p^k(1 − p)^{n−k}, n_k = (n choose k), P_k = p_k/Σ_{k=2}^{K} p_k n_k and P′_k = 1 − exp(−mp_k),

let M have binomial distribution Bin(m, P) with P = Σ_{k=2}^{K} p_k n_k, let X(M) be defined as in (6) and Y be defined as in (7). Then for all 2 ≤ k ≤ K the number of edges of H_k(n, m, p) has the same distribution as X^k(M) and the number of edges of H_k(n, 1 − exp(−mp_k)) has the same distribution as Y_k. Moreover, for any two hypergraphs H and H′ such that for all k ≥ 2 the number of edges of cardinality k in H and in H′ is the same, we have

Pr{⋃_{k=2}^{K} H_k(n, m, p) = H} = Pr{⋃_{k=2}^{K} H_k(n, m, p) = H′}

and

Pr{⋃_{k=2}^{K} H_k(n, 1 − exp(−mp_k)) = H} = Pr{⋃_{k=2}^{K} H_k(n, 1 − exp(−mp_k)) = H′}.

Therefore, by Fact 4, the following lemma implies Lemma 6.

Lemma 10. Let K ≥ 2 be a constant integer. Let p = o(1/n), M be a random variable with binomial distribution Bin(m, Σ_{k=2}^{K} (n choose k) p_k), p_k = p^k(1 − p)^{n−k}, and let X(M) be defined as in (6) for n_k = (n choose k) and P_k = p_k/Σ_{k=2}^{K} p_k n_k. Let moreover Y be defined as in (7) for P′_k = 1 − exp(−mp_k). Then

d_TV(X(M), Y) = o(1).

The proof of the above lemma uses the main idea of the proof of Claim 1 in [4]. However, a slight modification of the choice of M and M′ enables us to extend the result also to α ≤ 4.

Proof. We replace the binomial random variable M by the Poisson random variable M′ with the same expected value m Σ_{k=2}^{K} (n choose k) p_k. By Fact 7 the random variables X^k_i(M′) are independent and Pr{X^k_i(M′) = 1} = 1 − exp(−mp_k). Therefore in X(M′) = (X²(M′), ..., X^K(M′)) the random variables X²(M′), ..., X^K(M′) are independent with binomial distributions Bin(n₂, 1 − exp(−mp₂)), ..., Bin(n_K, 1 − exp(−mp_K)), respectively. By the definition of Y and by Facts 4 and 6 we have

d_TV(Y, X(M)) = d_TV(X(M′), X(M)) ≤ 2d_TV(M′, M) ≤ 2 Σ_{k=2}^{K} (n choose k) p_k = O(Σ_{k=2}^{K} (np)^k) = o(1). □
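The last step of the proof bounds d_TV(M, M′) via Fact 6. The inequality d_TV(Bin(n̂, p̂), Po(n̂p̂)) ≤ p̂ can be checked by direct summation (illustrative sketch with our parameter values; the truncation only lowers the computed sum):

```python
import math

def dtv_binom_poisson(n, p):
    """Total variation distance (1/2) sum_k |Bin(n,p)(k) - Po(np)(k)|,
    summed over k up to a cutoff beyond which both masses are negligible."""
    lam = n * p
    cutoff = 4 * n
    po = math.exp(-lam)            # Po(lam)(0)
    total = 0.0
    for k in range(cutoff):
        b = math.comb(n, k) * p**k * (1 - p) ** (n - k) if k <= n else 0.0
        total += abs(b - po)
        po *= lam / (k + 1)        # Po(lam)(k+1) from Po(lam)(k)
    return 0.5 * total
```

By Le Cam's inequality the distance is also at most n̂p̂², so for n̂p̂ ≤ 1 the bound d_TV ≤ p̂ of Fact 6 is guaranteed and the check below is safe.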

6 Proofs of Lemmas 7, 8 and 9

In fact we will show versions of the lemmas for random k-partite hypergraphs and graphs. The following fact shows that we may reduce the problem to the k-partite case. First let us introduce additional notation. Let X₁, ..., X_k be disjoint n-element sets and r ∈ [0, 1]. We define H_k(n, r) to be the hypergraph with vertex set ⋃_{i=1}^{k} X_i and with edge set being a random subset of E := {{x₁, ..., x_k} : x_i ∈ X_i for all 1 ≤ i ≤ k}, such that each element of E is added to E(H_k(n, r)) independently with probability r. Let moreover G_k(n, r) be the random k-partite graph with k-partition (X₁, ..., X_k), in which each edge appears independently with probability r. It is simple to prove the following fact.

Fact 9. Let a_n = Ω(1). If

(19) G(H_k(n, r^{k(k−1)/2})) ≼_{1−o(1)} G_k(n, a_n r),

then

G(H_k(n, 1 − (1 − r^{k(k−1)/2})^{k!})) ≼_{1−o(1)} G(n, 1 − (1 − a_n r)^{k(k−1)})

and, if a_n r = o(1), then for any constant c > k(k−1)

G(H_k(n, k! r^{k(k−1)/2})) ≼_{1−o(1)} G(n, c a_n r).

Proof. Let X_i = {x^i_1, ..., x^i_n}, for 1 ≤ i ≤ k, and V = {v₁, ..., v_n}. For a given instance of H_k(n, r^{k(k−1)/2}) (or G_k(n, a_n r)) we may get an instance of a hypergraph H_k(n, 1 − (1 − r^{k(k−1)/2})^{k!}) (or a graph G(n, 1 − (1 − a_n r)^{k(k−1)})) with vertex set V by merging, for all 1 ≤ j ≤ n, all the vertices x^i_j, 1 ≤ i ≤ k, into v_j and deleting the edges with fewer than k (respectively 2) distinct vertices. □

In the remaining part of the section we will prove the following lemmas.

Lemma 11. Let q = Ω(n⁻¹) and q = o(n^{−1/3}). Then

G(H₃(n, q³)) ≼_{1−o(1)} G₃(n, a_n(q)q),

where a_n(q) is defined as in (13).

The above lemma is a generalization of Theorem 1.7 from [8], where it was shown in the case (ln n/n²)^{1/3} ≤ q = o(n^{−3/5}) with a_n ≡ 17. Moreover we will prove analogues of the lemma for the 4-partite and 5-partite cases.

Lemma 12. Let q = Ω(n^{−2/3}), c₄ > 15^{1/3} and q = o(n^{−2/5}). Then

G(H₄(n, q⁶)) ≼_{1−o(1)} G₄(n, a_n(c₄q)c₄q),

where a_n(q) is defined as in (13).

Lemma 13. Let q = Ω(n^{−1/2}), c₅ > 2^{1/6}·3^{1/2}·3^{1/5} and q = o(n^{−2/5}). Then

G(H₅(n, q¹⁰)) ≼_{1−o(1)} G₅(n, a_n(c₅q)c₅q),

where a_n(q) is defined as in (13).
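The merging step of Fact 9 can be sanity-checked combinatorially: a merged pair {v_a, v_b} with a ≠ b has exactly k(k−1) preimage edges in the k-partite graph (one per ordered choice of the two parts), which is where the exponent in 1 − (1 − a_n r)^{k(k−1)} comes from. A small enumeration (illustrative sketch; names ours):

```python
from itertools import permutations

def preimage_edges(k, a, b):
    """Edges {x^i_a, x^j_b} of the k-partite graph, i != j, that merge onto
    the pair {v_a, v_b} under the vertex-merging of Fact 9. A vertex x^i_j
    is encoded as the pair (part index i, within-part index j)."""
    return {frozenset({(i, a), (j, b)}) for i, j in permutations(range(k), 2)}
```

Since the k(k−1) preimage edges are present independently with probability a_n r each, the merged pair is an edge with probability 1 − (1 − a_n r)^{k(k−1)}.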

We will prove Lemma 11 in detail. The proofs of the remaining lemmas are analogous, therefore we will only sketch them.

Proof of Lemma 11. First we will divide the hypergraph H₃(n, q³) into n independent, edge-disjoint hypergraphs H(x), x ∈ X₃, such that H₃(n, q³) = ⋃_{x∈X₃} H(x). For any x ∈ X₃, H(x) will be the hypergraph with vertex set {x} ∪ X₁ ∪ X₂ and edge set consisting of those edges from E(H₃(n, q³)) which contain x. Next, for each x separately, we will compare G(H(x)) with an auxiliary graph T(x). By definition, T(x) will be a graph with the same vertex set as H(x) and with an edge set constructed by the following procedure. First we add each edge {x, y}, y ∈ X₁ ∪ X₂, independently with probability Cq to the edge set, where

(20) C = C(q) =
  c, where c > 5, for nq² = o(1);
  ω(n), where ω(n) → ∞, for nq² = Θ(1);
  cnq², where c > 1, for nq² → ∞.

We assume that ω(n) tends slowly to infinity and that c is close to 5 and 1, respectively. Then, independently, with probability q we add to the edge set each edge {x₁, x₂} ∈ X₁* × X₂*, where, for each 1 ≤ i ≤ 2, X_i* is the set of vertices from X_i connected by an edge with x. We will prove that

(21) G(H(x)) ≼_{1−o(1/n)} T(x).

By the definition of T(x), (21) is equivalent to

(22) H′(x) ≼_{1−o(1/n)} T′(x),

where H′(x) is the graph with vertex set X₁ ∪ X₂ and edge set {{x₁, x₂} : {x₁, x₂, x} ∈ E(H(x))} and T′(x) is the subgraph of T(x) induced on X₁ ∪ X₂. Then we will show that

(23) ⋃_{x∈X₃} T(x) ≼_{1−o(1)} G₃(n, a_n(q)q).

To get (23) we will show that

(24) ⋃_{x∈X₃} T′(x) ≼_{1−o(1)} H₂(n, a_n(q)q)

and that H₂(n, a_n(q)q) is independent of the choice of the sets X_i*. By the definitions of a_n(q), T(x) and T′(x) this is equivalent to (23). Therefore the main parts of the proof will be to show (22) and (24).

Proof of (22). We will divide the proof into four cases:

1. $q=O(\ln n/n)$;
2. $\ln n/n\le q$ and $q=O((n^{-2}\ln n)^{1/3})$;
3. $(n^{-2}\ln n)^{1/3}\le q$ and $q=o(n^{-1/2})$;
4. $q=\Omega(n^{-1/2})$ and $q=o(n^{-1/3})$.

CASE 1. In this case we will use the fact that with probability $1-o(1/n)$ the graph $H'(x)$ contains at most one edge. Namely, the probability that $H'(x)$ has more than one edge is at most

$\binom{n^2}{2}q^6 = O(n^4q^6) = o\Big(\frac1n\Big).$

Moreover, for large $n$,

$\Pr\{\exists_{x_1\in X_1,x_2\in X_2}\ \{x_1,x_2\}\in E(T'(x))\}$
$\ge \sum_{x_1\in X_1,\,x_2\in X_2}\Pr\{\{x_1,x_2\}\in E(T'(x))\} - \sum_{x_1,x_1'\in X_1,\,x_2,x_2'\in X_2}\Pr\{\{x_1,x_2\},\{x_1',x_2'\}\in E(T'(x))\}$
$\ge n^2(Cq)^2q - \binom{n^2}{2}(Cq)^4q^2 - 2n\binom{n}{2}(Cq)^3q^2$
$= C^2n^2q^3\big(1-O(n^2q^3+nq^2)\big) \ge n^2q^3 \ge \Pr\{\exists_{x_1\in X_1,x_2\in X_2}\ \{x_1,x_2\}\in E(H'(x))\}.$

This gives an obvious coupling

$H'(x) \preceq_{1-o(1/n)} T'(x).$

CASES 2, 3 and 4. In the three latter cases we will use the fact that the number of vertices in $X_i'$ is sharply concentrated around its expected value. Let $r\in[0,1]$. We denote by $H_2(C'nq,r)$ the bipartite graph with bipartition $X_1,X_2$ and edge set constructed by the following procedure. Let

$C'=\begin{cases} 5, & \text{for } nq^2=o(1);\\ \omega'(n), & \text{for } nq^2=\Theta(1);\\ c'nq^2,\ \text{where } c'>1, & \text{for } nq^2\to\infty,\end{cases}$

be such that $C/C'>1$ is a constant. First, for each $1\le i\le 2$ independently, we choose $X_i'$ uniformly at random from all $C'nq$-element subsets of $X_i$, and then we add each edge $\{x_1,x_2\}\in X_1'\times X_2'$ to the edge set independently with probability $r$. By the Chernoff bound (9),

$H_2(C'nq,q) \preceq_{1-o(1/n)} T'(x).$

Therefore, since in $H'(x)$ each edge appears independently, in order to show (22) it suffices to show that in all three cases

$H_2(n,q^3) \preceq_{1-o(1/n)} H_2(C'nq,q).$

CASE 2. In this case we will use the fact that $H_2(n,q^3)$ (i.e. also $H'(x)$) with high probability does not contain many edges outside a maximum matching. First we will show auxiliary lemmas.

Lemma 14. Let $r=o(1/(C'nq))$, $\ln n=o(nq)$, and let $N_2(r)$ be the random variable denoting the size of the maximum matching in $H_2(C'nq,r)$. Then

(25)  $N_2'(r) \preceq_{1-o(1/n)} N_2(r)$,

where $N_2'(r)$ has binomial distribution $\mathrm{Bin}(C'nq,s_2(r))$ and $s_2(r)\sim C'nqr$.

Proof of Lemma 14. Let $H$ be a hypergraph chosen according to the probability distribution of $H_2(C'nq,r)$. Define $H'$ to be the hypergraph with vertex set $X_1$ and edge set $\{\{x_1\}\colon x_1\in X_1'\ \text{and}\ \exists_{x_2\in X_2'}\ \{x_1,x_2\}\in E(H)\}$. Notice that $H'$ is chosen according to the probability distribution of $H_1(C'nq,1-(1-r)^{C'nq})$, where $H_1(C'nq,1-(1-r)^{C'nq})$ is the hypergraph with vertex set $X_1$ and edge set constructed by first choosing $X_1'$ uniformly at random from all $C'nq$-element subsets of $X_1$ and then adding to the edge set each $\{x_1\}$, $x_1\in X_1'$, independently with probability $1-(1-r)^{C'nq}$. Let $H''$ be the subhypergraph of $H$ such that for each edge $\{x_1\}\in E(H')$ we pick uniformly at random an edge of $E(H)$ containing $x_1$ and add it to the edge set of $H''$. Notice that the maximum matching in $H$ is at least of the size of the set of non-isolated vertices of $X_2$ in $H''$. Moreover, the edge set of $H''$ may alternatively be constructed in the following way (i.e. this construction gives the same probability distribution). First we pick an integer according to the binomial distribution $\mathrm{Bin}(C'nq,1-(1-r)^{C'nq})$; then, given the value of the picked integer, we pick a subset $X_1''$ uniformly at random from all subsets of $X_1'$ of this cardinality. Independently we choose $X_2'$ uniformly at random from all $C'nq$-element subsets of $X_2$. Then to each vertex $x_1\in X_1''$, to create an edge, we add one vertex chosen uniformly at random from the set $X_2'$.
For all $x_1\in X_1''$ the choices of the second vertex are independent, with repetition. Therefore, by the above construction, (10) and Fact 8,

$X(M) \preceq_{1-o(1/n)} X(C'nq) \le N_2,$

where $X(M)$ and $X(C'nq)$ are defined as in (6) for $K=2$, $n_2=C'nq$, $P_2=\big(1-(1-r)^{C'nq}\big)/(C'nq)$, and $M$ has Poisson distribution $\mathrm{Po}\big(C'nq-3\sqrt{C'nq\ln n}\big)$. Thus, by Fact 7, $X(M)$ has binomial distribution $\mathrm{Bin}(C'nq,s_2(r))$, where

$s_2(r) = 1-\exp\Big(-\big(C'nq-3\sqrt{C'nq\ln n}\big)\big(1-(1-r)^{C'nq}\big)\big/(C'nq)\Big) = 1-\exp\big(-(1+o(1))\,C'nqr\big) \sim C'nqr.$
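The matching witness used in the proof of Lemma 14 can be sketched directly (a sketch; `adj`, which maps each non-isolated vertex of $X_1'$ to its neighbour list in $X_2'$, and the function name are ours):

```python
import random

def matching_lower_bound(adj, rng):
    """For every non-isolated x_1 pick one incident edge uniformly at
    random (this mimics the subhypergraph H'' from the proof of Lemma 14).
    Distinct chosen X_2-endpoints can be matched to distinct x_1's, so
    their number is a lower bound on the maximum matching size."""
    chosen = {rng.choice(nbrs) for nbrs in adj.values() if nbrs}
    return len(chosen)
```

If two vertices of $X_1$ happen to choose the same endpoint, only one of them contributes; this is why the witness is compared to a binomial variable rather than to the matching number itself.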

Using the above lemma we will show the existence of a coupling between the random variable $M_2$, denoting the size of the edge set of $H_2(n,q^3)$, and $N_2$.

Lemma 15. Let $C'=5$, let $M_2$ have binomial distribution $\mathrm{Bin}(n^2,q^3)$ and let $N_2$ be the size of the maximum matching in $H_2(C'nq,q)$. Then

$M_2 \preceq_{1-o(1/n)} N_2.$

Proof. By the previous lemma and Fact 1 it is sufficient to show

(26)  $M_2 \preceq_{1-o(1/n)} N_2',$

where $N_2'$ has binomial distribution $\mathrm{Bin}(C'nq,s_2(q))$ for some $s_2(q)\sim C'nq^2$. Notice that

$M_2=\sum_{i=1}^{nq}\xi_i$, where the $\xi_i$ are independent with distribution $\mathrm{Bin}(n/q,\,q^3)$;
$N_2'=\sum_{i=1}^{nq}\zeta_i$, where the $\zeta_i$ are independent with distribution $\mathrm{Bin}(C',s_2(q))$.

Since $s_2(q)\sim C'nq^2$, for large $n$ we have, for $1\le l\le 4$,

$\Pr\{\xi_i=l\} \le \frac{1}{l!}(nq^2)^l \le \binom{C'}{l}s_2^l(q)\big(1-s_2(q)\big)^{C'-l} = \Pr\{\zeta_i=l\}$

and

$\Pr\{\xi_i>4\} \le \binom{n/q}{5}q^{15} \le (nq^2)^5 = o\Big(\frac{1}{n^2q}\Big).$

Therefore, for all $1\le i\le nq$, it is simple to construct a probability measure on $\mathbb{N}\times\mathbb{N}$ whose existence implies $\xi_i \preceq_{1-o(1/(n^2q))} \zeta_i$. By Fact 2 this implies (26).

Since we will compare the sizes of the maximum matchings in $H_2(n,q^3)$ and $H_2(C'nq,q)$, we introduce additional notation. Let $\mathcal{G}$ be the set of bipartite graphs with bipartition $X_1,X_2$. We define: $\mathcal{M}(l)$, the subset of $\mathcal{G}$ containing all graphs with a maximum matching of cardinality $l$; $\mathcal{M}_1(l)$, the subset of $\mathcal{M}(l)$ containing all graphs with maximum degree $1$; $\mathcal{M}_2(l)$, the subset of $\mathcal{M}(l)$ containing all graphs with maximum degree $2$ and exactly one vertex of degree $2$; and

$\mathcal{M}_1=\bigcup_{l=0}^{n}\mathcal{M}_1(l),\qquad \mathcal{M}_2=\bigcup_{l=0}^{n}\mathcal{M}_2(l).$
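The block decomposition of $M_2$ used in the proof of Lemma 15 rests on the fact that independent binomials with the same success probability add: $\mathrm{Bin}(n^2,q^3)$ is exactly a sum of $nq$ independent $\mathrm{Bin}(n/q,q^3)$ variables. A quick numerical sanity check (the parameters below are ours):

```python
import random

def block_sum(N, m, p, rng):
    """Sum of m independent Bin(N // m, p) variables; because independent
    binomials with equal p add, the result is distributed Bin(N, p)
    (here N plays the role of n^2 and m the role of n*q)."""
    return sum(sum(rng.random() < p for _ in range(N // m)) for _ in range(m))
```

Averaging many samples of `block_sum(400, 20, 0.1, rng)` should give a value close to the Bin(400, 0.1) mean of 40.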

For $q=o(n^{-2/3})$,

$\Pr\{H_2(n,q^3)\notin\mathcal{M}_1\} \le 2n\binom{n}{2}q^6 = O(n^3q^6) = o\Big(\frac1n\Big),$

and for $q=O((n^{-2}\ln n)^{1/3})$,

$\Pr\{H_2(n,q^3)\notin\mathcal{M}_1\cup\mathcal{M}_2\} = O(n^6q^{12}+n^4q^9) = O\Big(\frac{(n^2q^3)^4}{n^2}+\frac{(n^2q^3)^3}{n^2}\Big) = o\Big(\frac1n\Big).$

Now let $\mu\colon\mathbb{N}\times\mathbb{N}\to[0,1]$ be the probability measure associated with the coupling of $M_2$ and $N_2$ from Lemma 15, and let $\mathcal{G}$ be the set of all bipartite graphs with bipartition $X_1,X_2$. Starting with the probability measure $\mu$ we will construct a coupling which implies, for large $n$,

$H_2(n,q^3) \preceq_{1-o(1/n)} H_2(C'nq,q).$

We introduce additional notation. Let $H_2[\mathcal{M}_1]$ be the random graph constructed in the following way. First we sample $H$ according to the probability distribution of $H_2(n,q^3)$, and if $H$ is not in $\mathcal{M}_1$, then we replace it by a graph chosen uniformly at random from $\mathcal{M}_1(|E(H)|)$. Moreover, let $H_2[\mathcal{M}_1\cup\mathcal{M}_2]$ be the random graph constructed by sampling $H$ according to the probability distribution of $H_2(n,q^3)$ and replacing it by a graph chosen uniformly at random from $\mathcal{M}_1(|E(H)|)$ in the case that it is not contained in $\mathcal{M}_1\cup\mathcal{M}_2$. The sizes of the edge sets of $H_2[\mathcal{M}_1]$ and $H_2[\mathcal{M}_1\cup\mathcal{M}_2]$ are random variables $M_2[\mathcal{M}_1]$ and $M_2[\mathcal{M}_1\cup\mathcal{M}_2]$, respectively; obviously both have the same distribution as $M_2$. For any event $A$, denote by $H_2[\mathcal{M}_1]_{[A]}$, $H_2[\mathcal{M}_1\cup\mathcal{M}_2]_{[A]}$ and $H_2(C'nq,q)_{[A]}$ the graphs $H_2[\mathcal{M}_1]$, $H_2[\mathcal{M}_1\cup\mathcal{M}_2]$ and $H_2(C'nq,q)$ conditioned on $A$.

In the case $q=o(n^{-2/3})$ the construction of the coupling is simple. By the above calculation,

$H_2(n,q^3) \preceq_{1-o(1/n)} H_2[\mathcal{M}_1],$

therefore we will only show

$H_2[\mathcal{M}_1] \preceq_{1-o(1/n)} H_2(C'nq,q).$

Let $(l_1,l_2)\in\mathbb{N}\times\mathbb{N}$ be chosen according to the probability measure $\mu$. If $l_1>l_2$, then we sample $H_2[\mathcal{M}_1]_{[M_2=l_1]}$ and $H_2(C'nq,q)_{[N_2=l_2]}$ independently. If $l_1\le l_2$, then first we sample an instance of $H_2(C'nq,q)_{[N_2=l_2]}$ and choose a subgraph uniformly at random from all its subgraphs contained in $\mathcal{M}_1(l_2)$; then, from the chosen subgraph, we delete $l_2-l_1$ edges chosen uniformly at random.
In this way we get an edge set of $H_2[\mathcal{M}_1]_{[M_2=l_1]}$.

Now we will construct a coupling in the case $q=\Omega(n^{-2/3})$ and $q=O((n^{-2}\ln n)^{1/3})$. Let

$P_1(l)=\Pr\{H_2[\mathcal{M}_1\cup\mathcal{M}_2]_{[M_2=l]}\in\mathcal{M}_1\};$
$P_2(l)=\Pr\{H_2(C'nq,q)_{[N_2=l]}\in\mathcal{M}_1\};$
$Q_1(l)=\Pr\{H_2[\mathcal{M}_1\cup\mathcal{M}_2]_{[M_2=l]}\notin\mathcal{M}_1\};$
$Q_2(l)=\Pr\{H_2(C'nq,q)_{[N_2=l]}\notin\mathcal{M}_1\}.$

We will show that

$H_2[\mathcal{M}_1\cup\mathcal{M}_2] \preceq_{1-o(1/n)} H_2(C'nq,q).$

Let $(l_1,l_2)\in\mathbb{N}\times\mathbb{N}$ be chosen according to the probability measure $\mu$. If $l_1>l_2$ or $l_2\ge\omega(n)\ln n$, where $\omega(n)$ is a sequence tending slowly to infinity, then we construct a pair of graphs from $\mathcal{G}$ by sampling $H_2[\mathcal{M}_1\cup\mathcal{M}_2]_{[M_2=l_1]}$ and $H_2(C'nq,q)_{[N_2=l_2]}$ independently. If $1\le l_1\le l_2<\omega(n)\ln n$, then we sample $H$, the second graph in the pair, according to the probability distribution of $H_2(C'nq,q)_{[N_2=l_2]}$. If $H\in\mathcal{M}_1$, then we choose the first graph uniformly at random from all subgraphs of $H$ contained in $\mathcal{M}_1(l_1)$. If $H\notin\mathcal{M}_1$, then with probability $\big(P_1(l_1)-P_2(l_2)\big)/Q_2(l_2)$ we choose the first graph uniformly at random from all subgraphs of $H$ contained in $\mathcal{M}_1(l_1)$, and with probability $Q_1(l_1)/Q_2(l_2)$ we choose the first graph uniformly at random from all subgraphs of $H$ contained in $\mathcal{M}_2(l_1)$. According to this construction, the first graph is chosen according to the probability distribution of $H_2[\mathcal{M}_1\cup\mathcal{M}_2]$ and the second according to the probability distribution of $H_2(C'nq,q)$. Moreover,

$\mu\{(l_1,l_2)\colon l_1>l_2\}=o\Big(\frac1n\Big).$

In addition, the size of the maximum matching, i.e. $N_2$, is at most the number of edges of $H_2(C'nq,q)$, which has binomial distribution with expected value $(C')^2n^2q^3=O(\ln n)$. Thus, by the Chernoff bound (9),

$\mu\{(l_1,l_2)\colon l_2\ge\omega(n)\ln n\}=o\Big(\frac1n\Big).$

Therefore this is the desired coupling, and it is well defined for large $n$ provided that $P_1(l_1)\ge P_2(l_2)$ for large $n$ and $l_1\le l_2$. Calculations show that, for a given $l<\omega(n)\ln n$ with $\omega(n)$ tending slowly to infinity,

$Q_1(l) = 1-\frac{\binom{n}{l}^2\,l!}{\binom{n^2}{l}} = 1-\prod_{i=0}^{l-1}\frac{(n-i)^2}{n^2-i} = 1-\prod_{i=0}^{l-1}\Big(1-\frac{2ni-i^2-i}{n^2-i}\Big) \le 1-\Big(1-\frac{2l}{n}\Big)^{l} \le \frac{2l^2}{n}$

and

$Q_2(l) = \Pr\{H_2(C'nq,q)_{[N_2=l]}\notin\mathcal{M}_1\} \ge \Pr\{H_2(C'nq,q)_{[N_2=l]}\in\mathcal{M}_2\}$
$\ge \big(1-\Pr\{H_2(C'nq,q)_{[N_2=l]}\notin\mathcal{M}_1\cup\mathcal{M}_2\}\big)\,\frac{\Pr\{H_2(C'nq,q)\in\mathcal{M}_2(l)\}}{\Pr\{H_2(C'nq,q)\in\mathcal{M}_1(l)\}+\Pr\{H_2(C'nq,q)\in\mathcal{M}_2(l)\}}$
$= \Omega(nq^2\,l) = \Omega(n^{-1/3}\,l),$

since a graph in $\mathcal{M}_2(l)$ consists of a matching of size $l$ together with one additional edge meeting the matching in exactly one vertex, so that

$\Pr\{H_2(C'nq,q)\in\mathcal{M}_2(l)\} = \Theta\Big(l\,C'nq\,\frac{q}{1-q}\Big)\Pr\{H_2(C'nq,q)\in\mathcal{M}_1(l)\} = \Omega(nq^2\,l)\,\Pr\{H_2(C'nq,q)\in\mathcal{M}_1(l)\}.$

Hence $Q_1(l_1)=o(Q_2(l_2))$ uniformly over all $1\le l_1\le l_2\le\omega(n)\ln n$ and all $\omega(n)$ such that $\omega(n)\ln n=o(nq)$.

CASES 3 and 4. In these cases we will use the fact that the number of edges in $H_2(n,q^3)$ and $H_2(C'nq,q)$ is sharply concentrated around its expected value. Therefore, in order to compare $H_2(n,q^3)$ and $H_2(C'nq,q)$, we will compare auxiliary random multigraphs. In fact we will construct a coupling of the degree sequences of multigraphs of which $H_2(n,q^3)$ and $H_2(C'nq,q)$ are the underlying graphs. By Fact 7, the graphs $H_2(n,q^3)$ and $H_2(C'nq,q)$ are underlying graphs of multigraphs $\hat H(x)$ and $\hat T(x)$ in which the number of edges has Poisson distribution with expected value $-n^2\ln(1-q^3)$ and $-(C'nq)^2\ln(1-q)$, respectively, and the edge set is constructed by choosing edges independently one by one, with repetition, from the edge set of the complete bipartite graph with bipartition $X_1,X_2$ and $X_1',X_2'$, respectively. By the Chernoff bound (10),

$\hat H(x) \preceq_{1-o(1/n)} \bar H(x) \quad\text{and}\quad \bar T(x) \preceq_{1-o(1/n)} \hat T(x),$

where $\bar H(x)$ is the multigraph with $C_1n^2q^3$ edges ($C_1>1$ a constant) and $\bar T(x)$ is the multigraph with $C_2n^2q^3$ edges, where $C_2/C_1>1$ is a constant. Notice that choosing one edge is equivalent to choosing its two endpoints independently, one from each set of the bipartition.
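The Poissonization step above (Fact 7) can be illustrated numerically (a sketch with made-up parameters): if $M\sim\mathrm{Po}(-N\ln(1-p))$ balls are thrown uniformly with repetition onto $N$ slots, each slot receives a $\mathrm{Po}(-\ln(1-p))$ number of copies independently, so it is occupied with probability $1-e^{\ln(1-p)}=p$, and the underlying simple structure has independent "edges" with probability $p$.

```python
import math
import random

def poisson_occupied(N, p, rng):
    """Throw M ~ Po(-N*ln(1-p)) balls uniformly with repetition onto N
    slots and return the set of occupied slots; each slot ends up
    occupied independently with probability p."""
    lam = -N * math.log(1.0 - p)
    # Knuth's method for Po(lam): count uniform factors until their
    # product drops below exp(-lam); adequate for moderate lam
    threshold, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= threshold:
            break
        k += 1
    return {rng.randrange(N) for _ in range(k)}
```

With $N=2000$ and $p=0.05$ the fraction of occupied slots concentrates around $p$, as the argument requires.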

Therefore we may independently choose the degree sequence in each set $X_i$ ($X_i'$) and then create the multigraph with the given degree sequence. Thus, by Fact 2, to prove that

$\bar H(x) \preceq_{1-o(1/n)} \bar T(x),$

we have to prove, for each $X_i$, $1\le i\le 2$, that

$(D^1_1,\dots,D^1_n) \preceq_{1-o(1/n)} (D^5_1,\dots,D^5_n),$

where $D^1_j$ is the random variable denoting the degree of the $j$-th vertex of $X_i$ in $\bar H(x)$ and $D^5_j$ is the random variable denoting the degree of the $j$-th vertex of $X_i$ in $\bar T(x)$.

We introduce auxiliary urn models. Assume that we have $n$ urns. Let $D^i=(D^i_1,\dots,D^i_n)$ be the random vector in which $D^i_j$ stands for the number of balls in the $j$-th urn in the $i$-th model. Let $1<C_1<C_2$, and let $C''$ and $C'''$ be further constants, the relevant ratios between them being constants greater than $1$, such that $C_2<C'''$. In the 1st model we throw $C_1n^2q^3$ balls one by one independently, with repetition, into urns chosen uniformly at random from the $n$ urns. In the 2nd model the number of thrown balls has Poisson distribution $\mathrm{Po}(C_2n^2q^3)$; i.e., by Fact 7, $D^2_j$ has Poisson distribution $\mathrm{Po}(C_2nq^3)$ (for $K=2$, $P_2=1/n$, $n_2=n$). In the 3rd model $D^3_j=D_j'\cdot D_j''$, where $D_j'$ is a Bernoulli random variable with probability of success $C''q$ and $D_j''$ has Poisson distribution $\mathrm{Po}(C'''nq^2)$. In the 4th model first we select $C'nq$ urns from the set of all urns, and the number of balls thrown into the selected urns has Poisson distribution $\mathrm{Po}(C_2n^2q^3)$; i.e., for the urns not selected $D^4_j=0$, and for the selected urns $D^4_j$ has Poisson distribution $\mathrm{Po}\big(C_2nq^2/C'\big)$ (by Fact 7 for $K=2$, $P_2=1/(C'nq)$, $n_2=C'nq$). In the 5th model first we select $C'nq$ urns and we throw $C_2n^2q^3$ balls one by one independently into urns chosen uniformly at random from the set of selected urns. By the Chernoff bound,

$D^1 \preceq_{1-o(1/n)} D^2 \quad\text{and}\quad D^3 \preceq_{1-o(1/n)} D^4 \preceq_{1-o(1/n)} D^5.$

Moreover, by Fact 2, if for large $n$

(27)  $D^2_j \preceq_{1-o(1/n^2)} D_j'D_j'',$

then for large $n$

(28)  $D^2 \preceq_{1-o(1/n)} D^3.$

The constants may be chosen such that

$C''=\begin{cases} 4, & \text{for } nq^2=o(1);\\ \omega''(n), & \text{for } nq^2=\Theta(1);\\ c''nq^2, & \text{for } nq^2\to\infty,\end{cases}$

where $c''>1$ and $\omega''(n)$ is a function tending slowly to infinity. For large $n$,

$\Pr\{D^2_j\ge1\} = 1-\exp(-C_2nq^3) \le C''q\big(1-\exp(-C'''nq^2)\big) = \Pr\{D^3_j\ge1\}.$

Moreover, for $t\ge2$, $nq^2=o(1)$ and large $n$,

$\Pr\{D^2_j\ge t\} \le \frac{(C_2nq^3)^t}{t!} = o\Big(C''q\,\frac{(C'''nq^2)^t}{t!}\Big) = o\big(\Pr\{D^3_j\ge t\}\big).$

This implies (27) for $nq^2=o(1)$. Let now $nq^2=\Omega(1)$. By the Chernoff bound, if we estimate the number of urns with at least one ball and compare it with the number of balls, we get that with probability $1-o(1/n)$ the number of urns with at least $2$ balls is $o(n^2q^3)$ in the 3rd model and $\Omega(n^2q^3)$ in the 2nd model. Therefore, since the urns with at least $2$ balls are distributed uniformly, the coupling is easy to construct.

Proof of (24). Let $C=C(q)$ be defined as in (20). Define

$X_n(x_1,x_2)=\{x\in X_3\colon x_1\in X_1'(x),\ x_2\in X_2'(x)\}.$

Then $|X_n(x_1,x_2)|$ has binomial distribution $\mathrm{Bin}(n,(Cq)^2)$ and

$\mathbb{E}|X_n| = C^2nq^2 = \begin{cases} cnq^2,\ \text{where } c>25, & \text{for } nq^2=o(1);\\ \omega^2(n)\,nq^2, & \text{for } nq^2=\Theta(1);\\ cn^3q^6,\ \text{where } c>1, & \text{for } nq^2\to\infty.\end{cases}$

Therefore, by Lemma 3,

$\Pr\{\exists_{x_1,x_2}\ |X_n(x_1,x_2)|\ge a_n(q)\} \le n^2\,\Pr\{|X_n(x_1,x_2)|\ge a_n(q)\} = o(1),$

where

$a_n(q)=\begin{cases} 3\ln n/\big(\ln\ln n-\ln(nq^2)\big), & \text{for } nq^2=o(1);\\ 3\ln n/\ln\ln n, & \text{for } nq^2=\Theta(1);\\ 3\ln n/\big(\ln\ln n-\ln(n^3q^6)\big), & \text{for } nq^2\to\infty \text{ and } n^3q^6=o(\ln n);\\ \omega_1(n)\,n^3q^6,\ \text{where }\omega_1(n)\to\infty, & \text{for } nq^2\to\infty \text{ and } n^3q^6=\Theta(\ln n);\\ cn^3q^6,\ \text{where } c>1, & \text{for } nq^2\to\infty \text{ and } \ln n=o(n^3q^6),\end{cases}$

since $\omega(n)$ may tend to infinity arbitrarily slowly. Thus

$\bigcup_{x\in X_3}T'(x) \preceq_{1-o(1)} H_2(n,a_n(q)q).$
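The first inequality displayed above (the comparison behind (27) in the regime $nq^2=o(1)$) can be checked numerically; the constants below are hypothetical stand-ins chosen so that the product of the Bernoulli and Poisson constants exceeds $C_2$:

```python
import math

def occupancy_probs(n, q, C2, C_bern, C_pois):
    """Pr{D^2_j >= 1} = 1 - exp(-C2*n*q^3) for the Poisson model, versus
    Pr{D^3_j >= 1} = C_bern*q*(1 - exp(-C_pois*n*q^2)) for the
    Bernoulli-times-Poisson model; for n*q^2 -> 0 the former is
    ~ C2*n*q^3 and the latter ~ C_bern*C_pois*n*q^3."""
    p2 = 1.0 - math.exp(-C2 * n * q ** 3)
    p3 = C_bern * q * (1.0 - math.exp(-C_pois * n * q ** 2))
    return p2, p3
```

For example, with $n=10^6$, $q=10^{-4}$ (so $nq^2=0.01$) and constants $C_2=2$, $C''=4$, $C'''=1.1$, the second probability indeed dominates the first.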

The proofs of the remaining two lemmas are analogous; however, we only need to consider the analogues of CASES 2 and 3, which shortens the proof.

Proof of Lemmas 12 and 13. Let $k=4$ or $k=5$. Similarly to the previous proof, we divide the hypergraph $H_k(n,q^{\binom{k}{2}})$ into $n$ edge-disjoint hypergraphs $H(x)$, $x\in X_k$, such that $H_k(n,q^{\binom{k}{2}})=\bigcup_{x\in X_k}H(x)$. For any $x\in X_k$, $H(x)$ is the hypergraph with vertex set $\{x\}\cup\bigcup_{i=1}^{k-1}X_i$ and edge set consisting of those edges of $E(H_k(n,q^{\binom{k}{2}}))$ which contain $x$. $T(x)$ is an auxiliary hypergraph which has the same vertex set as $H(x)$ and an edge set constructed by the following procedure. First we add each edge $\{x,y\}$, $y\in\bigcup_{i=1}^{k-1}X_i$, independently with probability $Cq$ ($C>5$) to the edge set, and then, independently, with probability $q^{\binom{k-1}{2}}$ we add to the edge set each edge $\{x_1,\dots,x_{k-1}\}\in X_1'\times\dots\times X_{k-1}'$, where, for each $1\le i\le k-1$, $X_i'$ is the set of vertices connected by an edge with $x$. Therefore we have to prove that

(29)  $G(H(x)) \preceq_{1-o(1/n)} G(T(x))$

and that

(30)  $\bigcup_{x\in X_k}G(T(x)) \preceq_{1-o(1)} G_k(n,a_n(c_kq)\,c_kq).$

To get (30) we will show that if $T'(x)$ is the subhypergraph of $T(x)$ induced on $\bigcup_{i=1}^{k-1}X_i$, then for a suitable constant $c$

(31)  $\bigcup_{x\in X_k}T'(x) \preceq_{1-o(1)} H_{k-1}\big(n,cq^{\binom{k-1}{2}}\big),$

where $H_{k-1}(n,cq^{\binom{k-1}{2}})$ is independent of the choice of the sets $X_i'$, and then we will use the lemma for $k-1$ to arrive at

$\bigcup_{x\in X_k}G(T'(x)) \preceq_{1-o(1)} G_{k-1}(n,a_n(c_kq)\,c_kq),$

which by the definition of $T(x)$ and $T'(x)$ implies (30). The proof of (29) in CASES 2 and 3 is similar to the proof of (22). We will only sketch the proof of (31). Define

$X_n = X_n(x_1,\dots,x_{k-1}) = \{x\in X_k\colon x_1\in X_1'(x),\dots,x_{k-1}\in X_{k-1}'(x)\}.$

It has binomial distribution $\mathrm{Bin}(n,(Cq)^{k-1})$, and for large $n$

$\mathbb{E}|X_n| = C^3nq^3 = o(n^{-1/5})\ \text{for}\ k=4 \quad\text{and}\quad \mathbb{E}|X_n| = C^4nq^4 = o(n^{-3/5})\ \text{for}\ k=5.$

Therefore, since

$\frac{\ln n}{\ln\ln n-\ln(n^{-1/5})}\to5 \quad\text{and}\quad \frac{\ln n}{\ln\ln n-\ln(n^{-3/5})}\to\frac53,$

by Lemma 3, for any constants $c_4'>15$ and $c_5'>20/3$,

$\Pr\{\exists_{x_1,\dots,x_{k-1}}\ |X_n(x_1,\dots,x_{k-1})|\ge c_k'\} \le n^{k-1}\,\Pr\{|X_n(x_1,\dots,x_{k-1})|\ge c_k'\} = o(1).$

Thus in the case $k=4$, for any constant $c_4>\sqrt[3]{15}$, we have

$\bigcup_{x\in X_4}T'(x) \preceq_{1-o(1)} H_3\big(n,(c_4q)^3\big).$

Thus, by Lemma 11,

$\bigcup_{x\in X_4}G(T'(x)) \preceq_{1-o(1)} G_3(n,a_n(c_4q)\,c_4q).$

Finally, since $a_n(c_4q)\ge5$,

$G(H_4(n,q^6)) \preceq_{1-o(1)} G_4(n,a_n(c_4q)\,c_4q).$

Analogously, for $k=5$ and $c>20/3$,

$\bigcup_{x\in X_5}T'(x) \preceq_{1-o(1)} H_4(n,cq^6).$

Therefore, by Lemma 12, for $c_5>\sqrt[6]{20/3}\cdot\sqrt[3]{15}=\sqrt[6]{2^2\cdot3\cdot5^3}$,

$\bigcup_{x\in X_5}G(T'(x)) \preceq_{1-o(1)} G_4(n,a_n(c_5q)\,c_5q),$

which implies the thesis.

Acknowledgments

I would like to thank Andrzej Ruciński for the suggestion to read the article [8].

References

[1] A. D. Barbour, L. Holst, and S. Janson. Poisson Approximation. Oxford University Press, 1992.

[2] M. Bloznelis, J. Jaworski, and K. Rybarczyk. Component evolution in a secure wireless sensor network. Networks, 53(1):19–26, 2009.

[3] B. Bollobás. Random Graphs. Academic Press, 1985.

[4] J. A. Fill, E. R. Scheinerman, and K. B. Singer-Cohen. Random intersection graphs when m = ω(n): An equivalence theorem relating the evolution of the G(n, m, p) and G(n, p) models. Random Structures and Algorithms, 16:156–176, 2000.

[5] E. Godehardt and J. Jaworski. Two models of random intersection graphs for classification. In Studies in Classification, Data Analysis and Knowledge Organization, pages 67–81. Springer, Berlin Heidelberg New York, 2003.

[6] S. Janson, T. Łuczak, and A. Ruciński. Random Graphs. Wiley, 2001.

[7] M. Karoński, E. R. Scheinerman, and K. B. Singer-Cohen. On random intersection graphs: The subgraph problem. Combinatorics, Probability and Computing, 8:131–159, 1999.

[8] J. H. Kim. Perfect matchings in random uniform hypergraphs. Random Structures and Algorithms, 23(2):111–132, 2003.

[9] T. Łuczak. On the equivalence of two basic models of random graphs. In M. Karoński, J. Jaworski, and A. Ruciński, editors, Random Graphs '87, pages 151–158. John Wiley & Sons, 1990.

[10] K. B. Singer-Cohen. Random Intersection Graphs. PhD thesis, Department of Mathematical Sciences, The Johns Hopkins University, 1995.

[11] D. Stark and K. Rybarczyk. Poisson approximation of the number of cliques in random intersection graphs. Submitted.