Planted Cliques, Iterative Thresholding and Message Passing Algorithms

1 Planted Cliques, Iterative Thresholding and Message Passing Algorithms. Yash Deshpande and Andrea Montanari, Stanford University. November 5, 2013.

2 Problem Definition
Given distributions $Q_0$, $Q_1$: a set $S \subseteq [n]$ is planted, and the data is a symmetric matrix $A$ ($A_{ij} = A_{ji}$) with
$$A_{ij} \sim \begin{cases} Q_1 & \text{if } i, j \in S, \\ Q_0 & \text{otherwise.} \end{cases}$$
Problem: given $A$, identify $S$.

5 An Example
$Q_1 = N(\lambda, 1)$, $Q_0 = N(0, 1)$: the entries indexed by $S \times S$ have mean $\lambda$, all other entries have mean $0$.

6 An Example
$A = \lambda\, u_S u_S^{\mathsf T} + Z$, with $u_S$ the indicator vector of $S$ and $Z_{ij} \sim N(0, 1)$.
Data = sparse, low-rank signal + noise.
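
A minimal sketch (assuming numpy is available) of how one might generate an instance of this Gaussian example, $A = \lambda\, u_S u_S^{\mathsf T} + Z$; the helper name `gaussian_planted_instance` is illustrative and not from the talk.

```python
import numpy as np

def gaussian_planted_instance(n, k, lam, seed=None):
    rng = np.random.default_rng(seed)
    S = rng.choice(n, size=k, replace=False)      # hidden set S
    u = np.zeros(n)
    u[S] = 1.0                                    # indicator vector u_S
    Z = rng.standard_normal((n, n))
    Z = (Z + Z.T) / np.sqrt(2)                    # symmetric N(0, 1) noise (off-diagonal)
    A = lam * np.outer(u, u) + Z                  # sparse, low-rank signal + noise
    return A, S
```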

8 Much work in statistics
Denoising: $y = x + \text{noise}$ [Donoho, Jin 2004], [Arias-Castro, Candès, Durand 2011], [Arias-Castro, Bubeck, Lugosi 2012].
Sparse signal recovery: $y = Ax + \text{noise}$ [Candès, Romberg, Tao 2004], [Chen, Donoho 1998].
The simple example above combines these different structures.

11 Our Running Example: Planted Cliques
$A$ is the ($\pm 1$-valued) adjacency matrix: $Q_1 = +1$ (deterministically), $Q_0 = \mathrm{Unif}\{+1, -1\}$. [Jerrum, 1992]
$S$ forms a clique in the graph.
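
A short sketch (numpy assumed) of the running example: a symmetric $\pm 1$ matrix in which the hidden set $S$ forms a clique (all $+1$ inside $S$, fair coin flips elsewhere). The function name is illustrative only.

```python
import numpy as np

def planted_clique_instance(n, k, seed=None):
    rng = np.random.default_rng(seed)
    S = rng.choice(n, size=k, replace=False)      # hidden clique
    A = rng.choice([-1.0, 1.0], size=(n, n))      # fair +/-1 coin flips
    A = np.triu(A, 1)
    A = A + A.T                                   # symmetric, zero diagonal
    A[np.ix_(S, S)] = 1.0                         # clique: +1 inside S x S
    np.fill_diagonal(A, 0.0)
    return A, S
```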

13 Our Running Example: Planted Cliques
Average-case version of MAX-CLIQUE.
Communities in networks.

15 How large should S be?
Let $|S| = k$. The largest clique in $G(n, 1/2)$ has size $\approx 2\log_2 n$ (second moment calculation), so identification requires $k > 2\log_2 n$.

17 Progress(0)
(figure: complexity versus $k$) Exhaustive search succeeds for any $k \geq 2\log_2 n$, but takes time $n^{2\log n}$.

18 A Naive Algorithm
Pick the $k$ largest-degree vertices of $G$ as $\hat S$.

20 Analysis of NAIVE
If $i \notin S$: $\deg(i) \sim \mathrm{Binomial}(n-1, \tfrac12)$, so $\max_{i \notin S} \deg(i) \leq \tfrac{n}{2} + O(\sqrt{n\log n})$.
If $i \in S$: $\deg(i) = k - 1 + \mathrm{Binomial}(n-k, \tfrac12)$, so $\min_{i \in S} \deg(i) \geq k - 1 + \tfrac{n-k}{2} - O(\sqrt{n\log n})$.
NAIVE works if $k \geq C\sqrt{n\log n}$.
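
A sketch of the NAIVE algorithm analyzed above: take the $k$ largest-degree vertices (in the $\pm 1$ encoding, "degree" is the row sum). Per the analysis it should succeed roughly when $k \geq C\sqrt{n\log n}$. The usage example relies on the hypothetical `planted_clique_instance` helper from the earlier sketch.

```python
import numpy as np

def naive_estimate(A, k):
    degrees = A.sum(axis=1)              # row sums play the role of degrees
    return np.argsort(degrees)[-k:]      # indices of the k largest degrees

# Example: recovery fraction at a size where NAIVE is predicted to work.
# A, S = planted_clique_instance(n=2000, k=400)
# S_hat = naive_estimate(A, 400)
# print(len(set(S_hat) & set(S)) / len(S))
```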

24 Progress(1)
(figure: complexity versus $k$) The degree heuristic runs in time $O(n^2)$ and succeeds once $k \gtrsim \sqrt{n\log n}$ [Kučera, 1995]; below that, only the $n^{2\log n}$ exhaustive search is available.

25 Spectral Method
$A = u_S u_S^{\mathsf T} + Z$, where $Z$ has independent $\pm 1$ entries outside the planted block.
Hopefully $v_1(A) \approx u_S$.

27 Analysis of SPECTRAL
By standard linear algebra:
$$\frac{A}{\sqrt n} = \frac{k}{\sqrt n}\, e_S e_S^{\mathsf T} + \frac{Z}{\sqrt n}, \qquad \Big\|\frac{Z}{\sqrt n}\Big\|_2 \approx 2,$$
so $\langle v_1(A), e_S \rangle \approx 1$ once $k \geq C\sqrt n$.
SPECTRAL works if $k \geq C\sqrt n$.
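
A sketch of SPECTRAL (numpy assumed): take the leading eigenvector of $A$ and keep the $k$ entries of largest magnitude; in practice a cleanup step would usually follow. The analysis above predicts success once $k \geq C\sqrt n$.

```python
import numpy as np

def spectral_estimate(A, k):
    eigvals, eigvecs = np.linalg.eigh(A)       # symmetric eigendecomposition
    v1 = eigvecs[:, np.argmax(eigvals)]        # leading eigenvector v_1(A)
    return np.argsort(np.abs(v1))[-k:]         # k entries of largest magnitude
```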

28 Progress(2)
(figure: complexity versus $k$) Spectral and related polynomial-time methods succeed once $k \geq C\sqrt n$ [Alon, Krivelevich and Sudakov, 1998], [Feige, Krauthgamer 2000], [Ames, Vavasis 2009].

30 Progress(3)
(figure: complexity versus $k$) [Dekel, Gurel-Gurevich and Peres, 2010] lower the threshold constant to $k \geq 1.65\sqrt n$; the recursion of [Alon, Krivelevich and Sudakov, 1998] reaches $k \geq C\sqrt{n/2^r}$ at the price of time $n^{r+2}$.

33 Phase transition in spectrum of A
(figure: limiting spectral densities in the two regimes)
If $k \geq (1+\varepsilon)\sqrt n$: $\langle v_1(A), e_S \rangle \geq \delta(\varepsilon)$.
If $k \leq (1-\varepsilon)\sqrt n$: $\langle v_1(A), e_S \rangle \leq n^{-1/2 + o(1)}$.
[Knowles, Yin, 2011]

37 Seems much harder than it looks!
Statistical algorithms fail if $k = n^{1/2 - \delta}$: [Feldman et al., 2012].
$r$-Lovász-Schrijver fails for $k \lesssim \sqrt{n/2^r}$: [Feige, Krauthgamer, 2002].
$r$-Lasserre fails for $k \lesssim \sqrt n/(\log n)^{r^2}$: [Wigderson and Meka, 2013].

40 Our result
Theorem (Deshpande, Montanari, 2013). If $|S| = k \geq (1+\varepsilon)\sqrt{n/e}$, there exists an $O(n^2 \log n)$-time algorithm that identifies $S$ with high probability.
I will present:
1. A (wrong) heuristic analysis
2. How to fix the heuristic
3. Lower bounds

43 Iterative Thresholding
The power iteration: $v^{t+1} = A\, v^t$.
Improvement: $v^{t+1} = A\, F_t(v^t)$, where $F_t(v) = (f_t(v_1), f_t(v_2), \dots, f_t(v_n))^{\mathsf T}$.
Choose $f_t(\cdot)$ to exploit the sparsity of $e_S$.
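
A minimal sketch (numpy assumed) of the improved power iteration on this slide, $v^{t+1} = A F_t(v^t)/\sqrt n$. The nonlinearity $f_t(x) = e^{\mu_t x - \mu_t^2}$ and the deterministic $\mu_t$ bookkeeping follow the state-evolution slides later in the talk; the clipping and caps are only for numerical safety. This is an illustration, not the exact algorithm analyzed in the paper (which uses the message-passing variant that appears later).

```python
import numpy as np

def iterative_thresholding(A, k, iters=20):
    n = A.shape[0]
    lam = k / np.sqrt(n)                       # signal strength lambda = k / sqrt(n)
    v = np.ones(n)
    mu = lam
    for _ in range(iters):
        f = np.exp(np.clip(mu * v - mu**2, -30.0, 30.0))        # F_t applied entrywise
        v = A @ f / np.sqrt(n)
        mu = min(lam * np.exp(min(mu * mu / 2.0, 25.0)), 10.0)  # mu_{t+1} = lam * exp(mu_t^2/2), capped
    return np.argsort(v)[-k:]                  # estimate S by the k largest entries of v^t
```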

48 (Wrong) Analysis
$$v_i^{t+1} = \frac{1}{\sqrt n} \sum_j A_{ij}\, f_t(v_j^t).$$
The $A_{ij}$ are random $\pm 1$ variables; use the Central Limit Theorem for $v_i^{t+1}$.

50 (Wrong) Analysis
If $i \notin S$:
$$v_i^{t+1} = \frac{1}{\sqrt n} \sum_j A_{ij}\, f_t(v_j^t) \approx N\Big(0,\ \frac{1}{n}\sum_j f_t(v_j^t)^2\Big).$$
Letting $v_i^t \approx N(0, \tau_t^2)$ ...
$$\tau_{t+1}^2 = \frac{1}{n}\sum_j f_t(v_j^t)^2 \to \mathbb E\{f_t(\tau_t Z)^2\}, \qquad Z \sim N(0,1).$$

54 (Wrong) Analysis
If $i \in S$:
$$v_i^{t+1} = \underbrace{\frac{1}{\sqrt n} \sum_{j \in S} f_t(v_j^t)}_{\mu_{t+1}} + \underbrace{\frac{1}{\sqrt n} \sum_{j \notin S} A_{ij}\, f_t(v_j^t)}_{\approx\, N(0,\, \tau_{t+1}^2)},$$
where
$$\mu_{t+1} = \frac{1}{\sqrt n} \sum_{j \in S} f_t(v_j^t) = \frac{k}{\sqrt n}\cdot\frac{1}{k} \sum_{j \in S} f_t(v_j^t) \to \lambda\, \mathbb E\{f_t(\mu_t + \tau_t Z)\}, \qquad Z \sim N(0,1),$$
with $\lambda = k/\sqrt n$.

58 Summarizing...
(figure: empirical distributions of $v_i^t$ for $i \notin S$, centered at $0$, and for $i \in S$, centered at $\mu_t$)

59 State Evolution
$$\mu_{t+1} = \lambda\, \mathbb E\{f_t(\mu_t + \tau_t Z)\}, \qquad \tau_{t+1}^2 = \mathbb E\{f_t(\tau_t Z)^2\}.$$
Using the optimal function $f_t(x) = e^{\mu_t x - \mu_t^2}$:
$$\mu_{t+1} = \lambda\, e^{\mu_t^2/2}, \qquad \tau_{t+1}^2 = 1.$$
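
A small numerical check (plain Python, no dependencies) of the state-evolution recursion $\mu_{t+1} = \lambda e^{\mu_t^2/2}$: for $\lambda$ above $1/\sqrt e$ the iterates escape to infinity, while below it they converge to a finite fixed point, matching the cobweb pictures on the next slide.

```python
import math

def run_state_evolution(lam, iters=40):
    mu = lam
    for _ in range(iters):
        mu = lam * math.exp(min(mu * mu / 2.0, 50.0))   # cap the exponent for safety
        if mu > 1e6:
            return float("inf")                          # iterates escape: success regime
    return mu                                            # finite fixed point: failure regime

for lam in (0.95 / math.sqrt(math.e), 1.05 / math.sqrt(math.e)):
    print(lam, run_state_evolution(lam))
```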

61 Fixed points develop below threshold!
(figure: cobweb plots of $\mu_{t+1}$ versus $\mu_t$ in the two regimes)
If $\lambda > \frac{1}{\sqrt e}$: the recursion $\mu_{t+1} = \lambda e^{\mu_t^2/2}$ has no fixed point and $\mu_t \to \infty$.
If $\lambda < \frac{1}{\sqrt e}$: a fixed point develops and $\mu_t$ stays bounded.

76 Analysis is wrong but...
Theorem (Deshpande, Montanari, 2013). If $|S| = k \geq (1+\varepsilon)\sqrt{n/e}$, there exists an $O(n^2 \log n)$-time algorithm that identifies $S$ with high probability.
... so we modify the algorithm.

77 What algorithm?
Slight modification to the iterative scheme: $(v_i^t)_{i \in [n]} \to (v_{i \to j}^t)_{i, j \in [n]}$,
$$v_{i \to j}^{t+1} = \frac{1}{\sqrt n} \sum_{\ell \neq i, j} A_{i\ell}\, f_t(v_{\ell \to i}^t).$$
The analysis is exact as $n \to \infty$.
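
A dense-matrix sketch (numpy assumed) of the message-passing update on this slide: `V[i, j]` stores $v_{i \to j}$, and the vectorized update computes, for each $i$, the full sum over $\ell$ and subtracts the single $\ell = j$ term (the $\ell = i$ term vanishes because $\mathrm{diag}(A) = 0$). The $\mu_t$ bookkeeping and the final scoring/selection step are assumptions in the spirit of the talk, not a verbatim transcription of the paper's algorithm.

```python
import numpy as np

def message_passing_estimate(A, k, iters=15):
    n = A.shape[0]
    lam = k / np.sqrt(n)
    V = np.ones((n, n))                                      # V[i, j] = v_{i -> j}
    mu = lam
    for _ in range(iters):
        F = np.exp(np.clip(mu * V - mu**2, -30.0, 30.0))     # f_t applied entrywise
        totals = (A * F.T).sum(axis=1)                       # sum_l A_{il} f_t(v_{l->i})
        V = (totals[:, None] - A * F.T) / np.sqrt(n)         # remove the l = j term
        mu = min(lam * np.exp(min(mu * mu / 2.0, 25.0)), 10.0)
    F = np.exp(np.clip(mu * V - mu**2, -30.0, 30.0))
    scores = (A * F.T).sum(axis=1) / np.sqrt(n)              # node estimates v_i
    return np.argsort(scores)[-k:]
```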

81 Fixing the heuristic
Lemma. Let $(f_t(z))_{t \geq 0}$ be a sequence of polynomials. Then, for every fixed $t$ and every bounded, continuous function $\psi : \mathbb R \to \mathbb R$, the following limits hold in probability:
$$\lim_{n \to \infty} \frac{1}{|S|} \sum_{i \in S} \psi(v_{i \to j}^t) = \mathbb E\{\psi(\mu_t + \tau_t Z)\},$$
$$\lim_{n \to \infty} \frac{1}{n} \sum_{i \in [n] \setminus S} \psi(v_{i \to j}^t) = \mathbb E\{\psi(\tau_t Z)\},$$
where $Z \sim N(0,1)$.

82 Proof Technique
Key ideas:
Expand $v_{i \to j}^t$ for polynomial $f_t(\cdot)$.
The wrong analysis works if $A \to A^t$, i.e. if a fresh matrix were used at each iteration.

85 Proof Technique - Expanding $v^t$
Let $f_t(x) = x^2$; then $v_{i \to j}^0 = 1$ and
$$v_{i \to j}^1 = \sum_{k \neq j} A_{ik}.$$
(figure: the corresponding one-level tree rooted at $i$)

86 Proof Technique - Expanding $v^t$
$$v_{i \to j}^2 = \sum_{k \neq j} A_{ik}\,(v_{k \to i}^1)^2 = \sum_{k \neq j} A_{ik} \sum_{\ell \neq i} A_{k\ell} \sum_{m \neq i} A_{km} = \sum_{k \neq j} \sum_{\ell \neq i} \sum_{m \neq i} A_{ik} A_{k\ell} A_{km}.$$
(figure: the corresponding depth-two tree rooted at $i$)
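
A small check (numpy assumed) of the expansion worked out above for $f_t(x) = x^2$: starting from $v^0_{i \to j} = 1$, two iteration steps (without the $1/\sqrt n$ normalization, as on this slide) agree with the explicit triple sum.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.choice([-1.0, 1.0], size=(n, n))
A = np.triu(A, 1)
A = A + A.T                                     # symmetric +/-1 matrix, zero diagonal

i, j = 0, 1
v1_to_i = {k: sum(A[k, l] for l in range(n) if l != i) for k in range(n)}   # v^1_{k->i}
v2_iter = sum(A[i, k] * v1_to_i[k] ** 2 for k in range(n) if k != j)

v2_sum = sum(A[i, k] * A[k, l] * A[k, m]
             for k in range(n) if k != j
             for l in range(n) if l != i
             for m in range(n) if m != i)

print(v2_iter, v2_sum)    # the two values coincide
```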

89 Proof Technique
$$v_{i \to j}^{t+1} = \sum_{k \neq i, j} A_{ik}\, f_t(v_{k \to i}^t), \qquad \theta_{i \to j}^{t+1} = \sum_{k \neq i, j} A_{ik}^t\, f_t(\theta_{k \to i}^t),$$
where $A^t$ is a fresh independent copy of $A$ at each iteration.
(figure: the expansion trees of the two processes)

90 Proof Technique - a Combinatorial Lemma
Lemma.
$$v_{i \to j}^t = \sum_{T \in \mathcal T_{i \to j}^t} A(T)\, \Gamma(T)\, v^0(T),$$
where $\mathcal T_{i \to j}^t$ consists of rooted, labeled trees that:
1. have maximum depth $t$;
2. do not backtrack.
(Similarly for the $\theta_{i \to j}^t$.)

91 Proof Technique - Moment Method
$$v_{i \to j}^{t+1} = \sum_{k \neq i, j} A_{ik}\, f_t(v_{k \to i}^t), \qquad \theta_{i \to j}^{t+1} = \sum_{k \neq i, j} A_{ik}^t\, f_t(\theta_{k \to i}^t).$$
$$\lim_{n \to \infty}\ \text{Moments of } v^{t+1} = \text{Moments of } \theta^{t+1}.$$
(figure: the expansion trees of the two processes)

93 Progress(4)
(figure: complexity versus $k$) The message-passing algorithm runs in time $O(n^2 \log n)$ and succeeds down to $k \geq \sqrt{n/e}$, below the spectral threshold $\sqrt n$.

94 Is this threshold fundamental?
Rest of the talk: perhaps.

95 The Hidden Set Problem
Given $G_n = ([n], E_n)$: a set $S \subseteq [n]$ is planted, and each edge $(i, j) \in E_n$ carries a label
$$A_{ij} \sim \begin{cases} Q_1 & \text{if } i, j \in S, \\ Q_0 & \text{otherwise.} \end{cases}$$
Problem: given the edge labels $(A_{ij})_{(i,j) \in E_n}$, identify $S$.

98 Local Algorithms
A $t$-local algorithm computes the estimate at $i$ as $\hat u(i) = F(A_{\mathrm{Ball}(i, t)})$, a function only of the labels in the radius-$t$ neighborhood of $i$.
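
A toy illustration (plain Python, adjacency-list based) of a $t$-local algorithm: the estimate at vertex $i$ depends only on the edge labels within the radius-$t$ ball around $i$. Here $F$ is just the average label in the ball, chosen purely for illustration; the talk does not specify a particular $F$.

```python
from collections import deque

def ball(adj, i, t):
    """Vertices within graph distance t of i (adj: dict of neighbor lists)."""
    dist = {i: 0}
    queue = deque([i])
    while queue:
        u = queue.popleft()
        if dist[u] == t:
            continue
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return set(dist)

def local_estimate(adj, labels, i, t):
    """labels: dict keyed by (u, w) with u < w; returns F(A restricted to Ball(i, t))."""
    B = ball(adj, i, t)
    edge_labels = [labels[(u, w)] for u in B for w in adj[u] if w in B and u < w]
    return sum(edge_labels) / max(len(edge_labels), 1)
```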

99 The Sparse Graph Analogue
$G_n = ([n], E_n)$, $n \geq 1$, satisfies: locally tree-like; regular of degree $\Delta$.
Further $Q_1 = +1$, $Q_0 = \mathrm{Unif}\{+1, -1\}$.
What can local algorithms achieve?

103 What can we hope for?
If $|S| = C\, n/\sqrt\Delta$: taking $\hat S_{\mathrm{naive}}$ to be a random set of size $|S|$ gives
$$\frac{1}{n}\, \mathbb E\{|\hat S_{\mathrm{naive}} \,\triangle\, S|\} = \Theta\Big(\frac{1}{\sqrt\Delta}\Big).$$

104 What can we hope for?
If $|S| = C\, n/\sqrt\Delta$: a Poisson bound shows that any local algorithm satisfies
$$\frac{1}{n}\, \mathbb E\{|\hat S \,\triangle\, S|\} \geq e^{-C'\sqrt\Delta}.$$

105 A result for local algorithms...
Theorem (Deshpande, Montanari, 2013). Let $G_n$ converge locally to the $\Delta$-regular tree.
If $|S| \geq (1+\varepsilon)\, n/\sqrt{e\Delta}$, there exists a local algorithm achieving
$$\frac{1}{n}\, \mathbb E\{|S \,\triangle\, \hat S|\} \leq e^{-\Theta(\sqrt\Delta)}.$$
Conversely, if $|S| \leq (1-\varepsilon)\, n/\sqrt{e\Delta}$, every local algorithm suffers
$$\frac{1}{n}\, \mathbb E\{|S \,\triangle\, \hat S|\} \geq \Theta\Big(\frac{1}{\sqrt\Delta}\Big).$$
With $\Delta = n - 1$ we recover the complete-graph result!

108 To conclude...
The message-passing algorithm performs weighted counts of non-reversing trees.
Such structures have been used elsewhere:
Clustering sparse networks: [Krzakala et al. 2013]
Compressed sensing: [Bayati, Lelarge, Montanari 2013]

111 To conclude...
What is the dense analogue for local algorithms?
What about other structural properties?
Thank you!
