E0 224 Computational Complexity Theory          Indian Institute of Science, Bangalore
Fall 2014                                       Department of Computer Science and Automation
Lecture: Oct 9, 2014
Lecturer: Chandan Saha <chandan@csa.iisc.ernet.in>        Scribe: Pawan Kumar

Interactive proof for graph non-isomorphism

In this lecture we show that the language GNI is in the class IP. Two graphs G_1 and G_2 are isomorphic if there is a permutation π of the labels of the nodes of G_1 such that π(G_1) = G_2, where π(G_1) is the labeled graph obtained by applying π to the vertex labels of G_1.

Definition 1. GNI = {⟨G_1, G_2⟩ : G_1 ≇ G_2, i.e. G_1 and G_2 are non-isomorphic}.

Definition 2. GI = {⟨G_1, G_2⟩ : G_1 ≅ G_2, i.e. G_1 and G_2 are isomorphic}.

Claim 1: GI ∈ NP.

Proof. The certificate is the description of a permutation π. One can apply π to the vertices of G_1 and check whether π(G_1) = G_2 in polynomial time.

Interactive protocol for GNI

Input: adjacency matrices of G_1 and G_2 (w.l.o.g. let n be the number of vertices of both G_1 and G_2).

V: The verifier picks i ∈_R {1, 2} and a random π ∈_R S_n, where S_n is the set of all permutations of the first n natural numbers. It computes H = π(G_i) and sends H to the prover.
P: After seeing H, the prover sends j ∈ {1, 2} to V.
V: If i = j then accept, else reject.

Observation:
  If (G_1, G_2) ∈ GNI then ∃P s.t. Pr[Out_V⟨V, P⟩(G_1, G_2) = 1] = 1.
  If (G_1, G_2) ∉ GNI then ∀P, Pr[Out_V⟨V, P⟩(G_1, G_2) = 1] ≤ 1/2.

Remark: The fact that the verifier keeps its random coins secret appears to be crucial. The class IP was defined in a work by Goldwasser, Micali and Rackoff in 1985. Laszlo Babai defined the classes AM (& MA) using public coins.

Definition 3 (Arthur–Merlin). For any k ∈ N, AM[k] is the subclass of IP[k] in which the verifier only sends random strings to the prover and is not allowed to use any random bits that have not been revealed to the prover. The class AM[2] is also denoted AM.
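As a sanity check, the certificate check showing GI ∈ NP and one round of the private-coin protocol above can be simulated for small graphs. This is an illustrative sketch (the helper names and the brute-force isomorphism test are mine, not part of the notes); the honest unbounded prover is simulated by exhaustive search:

```python
import itertools
import random

def apply_perm(adj, pi):
    """Adjacency matrix of pi(G): vertex v is relabelled pi[v]."""
    n = len(adj)
    out = [[0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            out[pi[u]][pi[v]] = adj[u][v]
    return out

def isomorphic(adj1, adj2):
    """Brute-force GI check (exponential time; fine for tiny graphs)."""
    n = len(adj1)
    return any(apply_perm(adj1, pi) == adj2
               for pi in itertools.permutations(range(n)))

def one_round(G1, G2):
    """One round of the private-coin protocol; True iff V accepts."""
    n = len(G1)
    i = random.choice((1, 2))                 # V's secret coin
    pi = list(range(n)); random.shuffle(pi)   # random pi in S_n
    H = apply_perm(G1 if i == 1 else G2, pi)  # V sends H = pi(G_i)
    # Honest prover: answer 1 iff H is isomorphic to G_1.
    j = 1 if isomorphic(H, G1) else 2
    return i == j

# Path on 3 vertices vs. triangle: non-isomorphic, so the prover
# can always recover i from H, and the verifier always accepts.
P3 = [[0,1,0],[1,0,1],[0,1,0]]
K3 = [[0,1,1],[1,0,1],[1,1,0]]
assert all(one_round(P3, K3) for _ in range(50))
```

When G_1 ≅ G_2, the same prover strategy makes the verifier accept with probability exactly 1/2, since H = π(G_i) is then distributed identically for i = 1 and i = 2.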
Definition 4 (Merlin–Arthur). MA is the class of languages with a two-round public-coin interactive proof in which the prover sends the first message.

Lemma 1: For every constant k ≥ 2, AM[k] = AM.

Recall: BP·NP = {L : L ≤_R SAT}.

Lemma 2: AM = BP·NP.

Proof. (AM ⊆ BP·NP): Suppose L ∈ AM; we need to show that L ∈ BP·NP. On a fixed input x, V picks a random string r and sends r to P. Upon receiving r, the prover sends a = g(x, r) to the verifier. Now V runs a polynomial-time algorithm, say f(x, r, a), and
  if x ∈ L then ∃P s.t. Pr_r[Out_V⟨V, P⟩(x, a, r) = 1] ≥ 2/3;
  if x ∉ L then ∀P, Pr_r[Out_V⟨V, P⟩(x, a, r) = 1] ≤ 1/3.
Note that for any strings x, r the execution between verifier and prover can be interpreted as a non-deterministic computation in which V, on input (x, r), has access to some witness a (provided by P), which is checked by the poly-time V. That is, the language L' = {(x, r) : ∃a s.t. V(x, r, a) = 1} is in NP, and therefore there exists a formula φ_{x,r} such that (x, r) ∈ L' ⇔ φ_{x,r} ∈ SAT. Observe that Out_V⟨V, P⟩(x, a, r) = 1 if and only if (x, r) ∈ L', i.e. φ_{x,r} ∈ SAT. Hence
  x ∈ L ⇒ Pr_r[φ_{x,r} ∈ SAT] ≥ 2/3,
  x ∉ L ⇒ Pr_r[φ_{x,r} ∈ SAT] ≤ 1/3.
Hence L ∈ BP·NP.

(BP·NP ⊆ AM): Suppose L ∈ BP·NP; we need to show that L ∈ AM. Since L ∈ BP·NP there is a poly-time algorithm f for constructing a formula φ_{x,r} = f(x, r) such that for every string x
  x ∈ L ⇒ Pr_r[φ_{x,r} ∈ SAT] ≥ 2/3,
  x ∉ L ⇒ Pr_r[φ_{x,r} ∈ SAT] ≤ 1/3.
The 2-round protocol for deciding L is as follows: the verifier sends the prover a random string r, and the prover replies with a satisfying assignment for φ_{x,r}. At the end, the verifier checks that the assignment indeed satisfies φ_{x,r}.

Theorem 5 (Goldwasser–Sipser). For every k : N → N with k(n) computable in poly time, IP[k] ⊆ AM[k+2].

We'll now show an AM protocol for GNI.

Claim 2: For two graphs G_1 and G_2 define the set
  S = {(H, π) : (H ≅ G_1 or H ≅ G_2) and π ∈ aut(H)},
where aut(H) denotes the set of automorphisms of H.
  Case 1: If G_1 ≇ G_2 then |S| = 2·n!.
  Case 2: If G_1 ≅ G_2 then |S| = n!.

Proof. For an n-vertex graph G consider the multiset all(G) = {π_1(G), ..., π_{n!}(G)} of all permuted versions of G.
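For a small graph one can enumerate the multiset all(G) directly and check the identity |aut(G)| · |iso(G)| = n! that the proof below relies on. A minimal sketch (function names are mine):

```python
import itertools
import math

def apply_perm(adj, pi):
    """Adjacency matrix of pi(G): vertex v is relabelled pi[v]."""
    n = len(adj)
    out = [[0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            out[pi[u]][pi[v]] = adj[u][v]
    return out

def aut_iso_sizes(adj):
    """Walk the multiset all(G); return (|aut(G)|, |iso(G)|)."""
    n = len(adj)
    auto = 0
    images = set()
    for pi in itertools.permutations(range(n)):
        img = apply_perm(adj, pi)
        if img == adj:
            auto += 1                       # pi is an automorphism
        images.add(tuple(map(tuple, img)))  # distinct labeled copies
    return auto, len(images)

P3 = [[0,1,0],[1,0,1],[0,1,0]]  # path on 3 vertices
a, i = aut_iso_sizes(P3)
assert (a, i) == (2, 3) and a * i == math.factorial(3)
```

For the triangle K_3 the same function returns (6, 1), which again multiplies to 3! = 6.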
This is indeed a multiset, since it is possible that π_i(G) = π_j(G) even when π_i ≠ π_j. Let aut(G) = {π : π(G) = G} be
the automorphisms of G, and let iso(G) be the set {π(G) : π is a permutation}. We claim that for any n-vertex graph G we have
  |aut(G)| · |iso(G)| = n!.
The reason is that the multiset all(G) has exactly n! elements in it, but each graph in iso(G) appears exactly |aut(G)| times in all(G) (because |aut(G)| = |aut(π(G))| for any permutation π).

Note that if G_1 ≅ G_2 then H is isomorphic to G_1 iff it is isomorphic to G_2; also the number of automorphisms of any such H is exactly |aut(G_1)|. So the size of S is exactly |aut(G_1)| · |iso(G_1)| = n!. On the other hand, if G_1 ≇ G_2, then the graphs isomorphic to G_1 are distinct from the graphs isomorphic to G_2, so in this case
  |S| = |aut(G_1)| · |iso(G_1)| + |aut(G_2)| · |iso(G_2)| = 2·n!.

Definition 6 (Pairwise independent hash functions). Let H_{m,k} be a collection of functions from {0,1}^m to {0,1}^k. H_{m,k} is pairwise independent if for all x, x' ∈ {0,1}^m with x ≠ x' and all y, y' ∈ {0,1}^k,
  Pr_{h ∈_R H_{m,k}}[h(x) = y and h(x') = y'] = 2^{-2k}.

Protocol: Goldwasser–Sipser set lower bound protocol.
Notation: Let S ⊆ {0,1}^m be a set such that membership in S can be certified efficiently. The prover's goal is to convince the verifier that |S| ≥ K; if |S| ≤ K/2 then the verifier should reject with high probability. Here K = 2·n!, and k is chosen such that 2^{k-2} < K ≤ 2^{k-1}.

V: The verifier picks a random h ∈_R H_{m,k} and y ∈_R {0,1}^k, and sends h, y to the prover P.
P: The prover returns an x ∈ {0,1}^m and a string z (an honest prover returns an x ∈ S with h(x) = y if such an x exists, and z certifies that x ∈ S).
V: If h(x) = y and z certifies that x ∈ S, then accept; otherwise reject.

Theorem 7. GNI ∈ AM.

Proof. We need to show that there is a 2-round public-coin protocol such that
  if ⟨G_1, G_2⟩ ∈ GNI, i.e. G_1 ≇ G_2, then the probability of acceptance is high;
  if ⟨G_1, G_2⟩ ∉ GNI, i.e. G_1 ≅ G_2, then the probability of acceptance is low.
The protocol is the set lower bound protocol above, run on the set S of Claim 2 with K = 2·n!. Using that claim we compute the acceptance probability in the two cases:
  Case 1: if |S| = 2·n! then Pr_{h,y ∈_R {0,1}^k}[∃x ∈ S s.t. h(x) = y] ≥ (3/4) · 2·n!/2^k.
  Case 2: if |S| = n! then Pr_{h,y ∈_R {0,1}^k}[∃x ∈ S s.t. h(x) = y] ≤ n!/2^k.
The bound in Case 2 is just a union bound over x ∈ S. For Case 1, fix y arbitrarily and compute Pr_h[∃x ∈ S s.t. h(x) = y]. Let E_x be the event that h(x) = y. By the inclusion–exclusion principle,
  Pr[∪_{x∈S} E_x] ≥ Σ_{x∈S} Pr[E_x] − (1/2) Σ_{x≠x'∈S} Pr[E_x ∩ E_{x'}].
By pairwise independence we have, for x ≠ x',
  Pr[E_x] = 1/2^k  and  Pr[E_x ∩ E_{x'}] = 1/2^{2k}.
Plugging these values into the inclusion–exclusion bound above,
  Pr[∃x ∈ S s.t. h(x) = y] ≥ |S|/2^k − (1/2) · |S|²/2^{2k} = (|S|/2^k) · (1 − |S|/2^{k+1}) ≥ (3/4) · |S|/2^k,
where the last inequality uses |S| ≤ 2^{k-1}. Together with the union bound this proves:

Lemma: Let p = |S|/2^k ≤ 1/2. Then (3/4)·p ≤ Pr_{h, y ∈_R {0,1}^k}[∃x ∈ S s.t. h(x) = y] ≤ p.

Note: If we repeat the lower bound protocol independently M times, where M ∈ poly(|x|), we can tightly bound the probability of acceptance using the Chernoff bound. Write p = 2·n!/2^k, so that 1/4 < p ≤ 1/2 by the choice of k.

Case 1: If |S| = n!, i.e. G_1 ≅ G_2, then the verifier accepting is a bad event, and
  Pr[Bad Event] = Pr[V accepts] ≤ |S|/2^k = p/2.
Case 2: If |S| = 2·n!, i.e. G_1 ≇ G_2, then the verifier accepting is a good event, and
  Pr[Good Event] = Pr[V accepts] ≥ (3/4) · |S|/2^k = (3/4)·p.

Remark: So for a single iteration, Pr[V accepts] ≤ p/2 in Case 1, while Pr[V accepts] ≥ (3/4)·p in Case 2.

Let X_i be the indicator random variable defined by
  X_i = 1 if V accepts in the i-th iteration,
  X_i = 0 if V rejects in the i-th iteration.
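With X_i as just defined, the repeated protocol can be simulated. This toy sketch assumes per-round acceptance probability at most p/2 in the isomorphic case and at least (3/4)·p in the non-isomorphic case, takes p = 1/2, and places the decision threshold at the midpoint 5p/8; all names and concrete values here are illustrative:

```python
import random

random.seed(0)  # deterministic run, for reproducibility

def run_rounds(M, accept_prob):
    """X = number of accepting iterations out of M independent rounds."""
    return sum(random.random() < accept_prob for _ in range(M))

def decide(X, M, p=0.5):
    """Output non-isomorphic iff X/M is above the midpoint of p/2 and 3p/4."""
    return "non-isomorphic" if X / M > 5 * p / 8 else "isomorphic"

M, p = 5000, 0.5
X_iso    = run_rounds(M, p / 2)      # Case 1: G_1 iso G_2, accept prob <= p/2
X_noniso = run_rounds(M, 3 * p / 4)  # Case 2: G_1 not iso G_2, accept prob >= 3p/4
print(decide(X_iso, M), decide(X_noniso, M))  # w.h.p.: isomorphic non-isomorphic
```

By the Chernoff-bound analysis that follows, the probability that either decision is wrong drops exponentially in M.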
Let X = Σ_{i=1}^{M} X_i, and write p = 2·n!/2^k as before. Then
  E[X] = E[Σ_{i=1}^{M} X_i] = Σ_{i=1}^{M} E[X_i] = Σ_{i=1}^{M} Pr[X_i = 1]
(by linearity of expectation; the X_i's are i.i.d.), so in Case 1 we have E[X] ≤ (p/2)·M, while in Case 2 we have E[X] ≥ (3/4)·p·M. If the observed value of X is close to (p/2)·M we output "G_1 ≅ G_2"; if it is close to (3/4)·p·M we output "G_1 ≇ G_2". In Case 1 the expected value is at most (p/2)·M and the error probability is Pr[X > (1+δ)·(p/2)·M]; in Case 2 the expected value is at least (3/4)·p·M and the error probability is Pr[X < (1−δ)·(3/4)·p·M]. We will use the Chernoff bounds
  Pr[X > (1+δ)·E[X]] ≤ e^{−E[X]·δ²/3}  and  Pr[X < (1−δ)·E[X]] ≤ e^{−E[X]·δ²/2}
to restrict these error probabilities, upper bounding the error in both cases.

Case 1: Here Pr[Error] = Pr[X > (1+δ)·(p/2)·M] and E[X] ≤ (p/2)·M. We cannot directly apply the Chernoff bound: if E[X] is close to 0, a bound relative to (1+δ)·E[X] says nothing useful about the fixed threshold (1+δ)·(p/2)·M. So we first apply Markov's inequality:
  Pr[Error] = Pr[X > (1+δ)·(p/2)·M] ≤ E[X] / ((1+δ)·(p/2)·M).
If E[X] ≤ (p/3)·M then Pr[Error] ≤ 2/(3·(1+δ)). Otherwise, i.e. when E[X] > (p/3)·M, we apply the Chernoff bound: since E[X] ≤ (p/2)·M we have (1+δ)·E[X] ≤ (1+δ)·(p/2)·M, so X > (1+δ)·(p/2)·M implies X > (1+δ)·E[X], and hence
  Pr[X > (1+δ)·(p/2)·M] ≤ Pr[X > (1+δ)·E[X]] ≤ e^{−E[X]·δ²/3} ≤ e^{−p·M·δ²/9},  as E[X] > (p/3)·M.
Remark: By increasing the number of rounds M we can decrease this error probability. Denoting the error probability for this case by EP_1,
  EP_1 = Pr[Error] ≤ max(2/(3·(1+δ)), e^{−C_1·M}), where C_1 = p·δ²/9 is a constant.

Case 2: Here Pr[Error] = Pr[X < (1−δ)·(3/4)·p·M] and E[X] ≥ (3/4)·p·M, so (1−δ)·E[X] ≥ (1−δ)·(3/4)·p·M. Thus X < (1−δ)·(3/4)·p·M implies X < (1−δ)·E[X], and hence
  Pr[X < (1−δ)·(3/4)·p·M] ≤ Pr[X < (1−δ)·E[X]] ≤ e^{−E[X]·δ²/2} ≤ e^{−(3/8)·p·M·δ²}.
Remark: Denoting the error probability for this case by EP_2,
  EP_2 = Pr[Error] ≤ e^{−C_2·M}, where C_2 = (3/8)·p·δ² is a constant.

The overall error probability is Pr[Error] ≤ max(EP_1, EP_2). Finally, choose δ such that the two thresholds are separated:
  (1+δ)·(p/2)·M < (1−δ)·(3/4)·p·M  ⇔  (1+δ)/(1−δ) < 3/2  ⇔  δ < 1/5.
For example, δ = 1/10 suffices.

Lemma: If GI is NP-complete then PH collapses.
Proof. Let us assume GI is NP-complete. Then:
  GI is NP-complete ⇒ GNI is coNP-complete
  ⇒ SAT^c ≤_P GNI (where SAT^c is the complement of SAT)
  ⇒ SAT^c ∈ BP·NP (as GNI ∈ AM = BP·NP)
  ⇒ SAT^c ≤_R SAT
  ⇒ coNP ⊆ BP·NP ⊆ NP/poly.

Note (assignment problem): If coNP ⊆ NP/poly then PH collapses to its third level Σ_3^p. (This is also known as Yap's theorem.)

References
[AB] S. Arora and B. Barak. Computational Complexity: A Modern Approach. Cambridge University Press, 2009.