The spin-glass cornucopia


1 The spin-glass cornucopia. Marc Mézard, École normale supérieure - PSL Research University and CNRS, Université Paris Sud. LPT-ENS, January 2015

2 40 years of research

6 40 years of research
- 70s: anomalies in the magnetic response of some obscure and useless alloys
- 1975: Edwards and Anderson define a model, an order parameter, a new phase of matter
- Theory develops along two lines: mean-field theory (Sherrington-Kirkpatrick) and «real space» droplets
- Solution of the SK model by G. Parisi with replicas; physical understanding (ultrametricity), cavity method (MPV) (MP)

7 Where do we stand? Real spin glasses: still an open problem. A very sophisticated and powerful corpus of conceptual and methodological approaches has been developed, and used in many different fields. (Portrait of Ottavio Strada, Tintoretto, Venice 1567, Rijksmuseum Amsterdam.)

8 The cornucopia
- Statistical physics (glasses, disordered polymers, ...)
- Computer science (satisfiability, coloring)
- Information theory (error correction, inference)
- Signal acquisition and processing, big data, significant information
- Gene expression networks
- Neural networks
- Interacting agents on a market

12 Why?
- Beginning of statistical physics and condensed-matter theory: homogeneous systems. All atoms are alike. Easy (so to speak).
- Gradually: add dirt! Defects, dislocations, pinning sites, mixtures, walls, etc.
- Last forty years: handle strongly disordered systems. Each atom has a different environment. Even the nature of the basic phases is hard to understand. Heterogeneous «agents».

13 Table of contents
- Spin glasses: dynamics, mean-field description
- Information theory: phase transitions and glass phases
- General setup: link to hardness
- Bird's-eye view: compressed sensing

17 Magnets. Spins $s_i \in \{\pm 1\}$, couplings $J_{ij}$, energy $E = -\sum_{ij} J_{ij} s_i s_j$. Equilibrium: $P(s_1,\dots,s_N) = \frac{1}{Z}\, e^{-E/T}$. Ferromagnet: $J_{ij} = J > 0$. Phases: $\langle s_i \rangle = M$, the «representative agent». (Figure: $M$ vs $T$; below $T_c$, two states $\pm M$.) Self-consistency: $M = \tanh\big(\beta \sum_j J_{ij} M_j\big) \simeq \tanh(\beta z J M)$, with $z$ the coordination number.

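The self-consistency equation above can be iterated numerically. A minimal sketch, assuming illustrative values ($zJ = 1$, an arbitrary initial bias of 0.5, not from the slides): above the mean-field critical temperature $T_c = zJ$ the only solution is $M = 0$, below it a spontaneous magnetization appears.

```python
import numpy as np

def mean_field_magnetization(T, zJ=1.0, tol=1e-10, max_iter=10_000):
    """Solve M = tanh(zJ * M / T) by fixed-point iteration,
    starting from a small positive bias."""
    M = 0.5
    for _ in range(max_iter):
        M_new = np.tanh(zJ * M / T)
        if abs(M_new - M) < tol:
            break
        M = M_new
    return M

M_high = mean_field_magnetization(T=2.0)   # T > T_c = zJ: M -> 0
M_low  = mean_field_magnetization(T=0.5)   # T < T_c: spontaneous M != 0
```
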
18 Spin glasses (e.g. CuMn). Same setup: $s_i \in \{\pm 1\}$, $E = -\sum_{ij} J_{ij} s_i s_j$, $P(s_1,\dots,s_N) = \frac{1}{Z}\, e^{-E/T}$. Spin glass: the sign of $J_{ij}$ depends on the pair $ij$: frustration. What is the behavior?

19 Spin glasses. Linear response to a small magnetic field shows new dynamics and memory. (Figure: susceptibility $\chi$ (a.u.) vs time (min) at $T = 12$ K and $T = 10$ K; E. Vincent et al., SPEC.)

20 Mean-field lessons. (Figure: energy vs configurations for an Ising ferromagnet and for a spin glass; magnetization/overlap $\frac{1}{N}\sum_i \sigma_i s_i$.)
1. Glass «phase»: many pure states, unrelated by symmetry, organized in a hierarchical «ultrametric» structure.
2. Many metastable states, unrelated by symmetry.
3. «True» ground state: fragile!

21 Take-home: glass phases
- Proliferation of states (exponentially many)
- Hierarchical structures
- Various time scales: aging, modification of the FDT
- Technically, powerful methods: replicas, dynamical field theories, cavity method (self-consistent equations for $m_i$, the magnetization of spin $i$ in a given state, plus statistical analysis)

25 An example from information theory. Claude Shannon. Talk: $S \sim \log_2 N$, or $S \sim N$. More recent convergence: error-correcting codes, pool testing, compressed sensing. The new frontier: infinitely complex information.

26 Information transmission and error correction

30 Principle of error correction: redundancy. Transmission chain: original message $a$ ($L = N - M$ bits) → encoder → encoded message $x$ ($N$ bits) → channel → received message $r$ ($N$ bits) → decoder → estimate $\hat a$ of the original message. Encoding = add redundancy; rate $R = L/N$. E.g. repetition, rate $= 1/3$: $0 \to 000$, $1 \to 111$. Majority decoding: error probability $p^3 + 3p^2(1-p) \simeq 3p^2$.

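The majority-decoding error probability quoted above is easy to check by simulation. A sketch, assuming an illustrative flip probability $p = 0.1$ and a fixed random seed:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.1            # channel flip probability (illustrative)
trials = 200_000

# Encode each bit as three copies, flip each copy independently with
# probability p, then decode by majority vote.
bits = rng.integers(0, 2, size=trials)
codewords = np.repeat(bits[:, None], 3, axis=1)
flips = rng.random((trials, 3)) < p
received = codewords ^ flips
decoded = (received.sum(axis=1) >= 2).astype(int)

empirical = np.mean(decoded != bits)
theory = p**3 + 3 * p**2 * (1 - p)    # ~ 3 p^2 for small p
```
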
32 Shannon's theorem: for a given noise level $p$, one can build an encoder/decoder pair which transmits reliably at any rate $R < R_c(p)$. Two ingredients: the «thermodynamic limit» $N, L \to \infty$, and an ensemble of random codes (the Random Energy Model of spin glasses).

35 Phase transitions in decoding. Decoding = find the closest codeword. (Figure: probability of perfect decoding vs noise $p$: equal to 1 below the Shannon threshold, 0 above.) Shannon's geometric phase transition = REM phase transition.

36 Shannon code ensemble. (Picture: unit hypercube in dimension $N$, with $2^L$ random codewords, the sent codeword, and the received word.) Perfect codes in principle, but impossible to decode in practice (structureless): time $O(e^N)$.

37 Efficient codes: parity checks. Shannon's code is useless (time $O(e^N)$). Add redundancy with a structure that allows decoding, e.g. three parity checks $a, b, c$ on seven bits:
x1 + x4 + x5 + x7 = 0 (mod 2)
x2 + x4 + x6 + x7 = 0 (mod 2)
x3 + x5 + x6 + x7 = 0 (mod 2)
This leaves $2^4 = 16$ codewords among $2^7 = 128$ words.

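The codeword count follows by brute-force enumeration: the three checks above are linearly independent, so they cut the $2^7$ words down to $2^{7-3}$ codewords. A sketch:

```python
import itertools

# The three parity checks from the slide, on bits x1..x7 (here x[0]..x[6]).
def is_codeword(x):
    a = (x[0] + x[3] + x[4] + x[6]) % 2 == 0   # x1+x4+x5+x7
    b = (x[1] + x[3] + x[5] + x[6]) % 2 == 0   # x2+x4+x6+x7
    c = (x[2] + x[4] + x[5] + x[6]) % 2 == 0   # x3+x5+x6+x7
    return a and b and c

words = list(itertools.product([0, 1], repeat=7))
codewords = [x for x in words if is_codeword(x)]
# 3 independent checks on 7 bits: 2^(7-3) = 16 codewords among 2^7 = 128 words.
```
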
38 Error decoding with parity checks. $N$ variables, $M = (1-R)N$ equations. Random construction with $K$ variables per equation and $L$ equations per variable. Example: $N = 20$, $M = 10$, $R = 1/2$, $L = 3$, $K = 6$.

42 Error decoding with parity checks: flipped bits produce violated parity constraints on the factor graph. Decoding = iteration of mean-field «cavity» equations: a «message passing» («belief propagation») algorithm.

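Full belief propagation takes a page; as a simpler message-passing relative (a Gallager-style bit-flipping decoder, not the BP of the slides), one can repeatedly flip the bit involved in the largest number of violated checks. The matrix H encodes the three parity checks of the earlier slide; the single-error scenario is illustrative.

```python
import numpy as np

# Parity-check matrix for the three checks x1+x4+x5+x7, x2+x4+x6+x7, x3+x5+x6+x7.
H = np.array([[1, 0, 0, 1, 1, 0, 1],
              [0, 1, 0, 1, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 1]])

def bit_flip_decode(r, H, max_iter=20):
    x = r.copy()
    for _ in range(max_iter):
        syndrome = H @ x % 2
        if not syndrome.any():
            return x                      # all parity checks satisfied
        votes = syndrome @ H              # how many violated checks each bit enters
        x[np.argmax(votes)] ^= 1          # flip the most "blamed" bit
    return x

sent = np.zeros(7, dtype=int)             # the all-zero codeword
received = sent.copy()
received[4] ^= 1                          # channel flips one bit
decoded = bit_flip_decode(received, H)
```
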
46 Error correction: decoding. (Picture: unit hypercube in dimension $N$, with codewords, the sent codeword, the received word, and metastable states.) Seek a codeword at distance $\simeq pN$ from the received word.

49 Phase transitions in decoding. (Figure: probability of perfect decoding vs noise $p$, with algorithmic, maximum-probability and Shannon thresholds.) Metastable states = traps: a glass phase.

53 Phase transitions in decoding. Parity-check code in its easy phase: time $O(N)$; parity-check code in its glass phase: time $O(e^N)$; «ideal» code: time $O(e^N)$. Example, a (3,6) code on the BSC: $p_d = 0.084$ (algorithmic), $p_c = 0.101$ (maximum probability), $p_S = 0.110$ (Shannon).

54 Example: repetition code, $k = 3$ (D. MacKay). (Figure: message $s$ → encoder → $t$ → channel with flip probability $f = 10\%$ → $r$ → decoder → $\hat s$; decoding at $p = 0.1$.)

55 Low-density parity-check code with rate 1/3: transmission over the same channel ($p = 0.1$), decoding is errorless. (Figure: binary symmetric channel: $0 \to 0$ and $1 \to 1$ with probability $1-f$, flipped with probability $f$.)

56 Glasses and algorithms. Phase transitions and glass phases are ubiquitous, in nature and in algorithms that try to solve constraint satisfaction problems with many variables and constraints. Phase transitions relate to the equilibrium system, reached after infinitely long (i.e. exponential) time. Dynamic glass transitions relate to the practical performance of (polynomial-time) algorithms. Deep links with computer science.

57 P versus NP. (Diagram, conjectured $P \ne NP$ vs possible $P = NP$. NP-complete: SAT, 3-SAT, 3-colouring, TSP(d), 3-d spin glass, Hamiltonian cycle, LDPC. P: 2-SAT, 2-colouring, Eulerian circuit, assignment, 2-d spin glass.)

58 Links between NP-hardness and the glass transition? Worst case vs typical sample: statistical physics = behavior of a typical large-size sample drawn from an ensemble.
- Importance of phase transitions
- Importance of metastable states
- Use of physics ideas to come up with new types of algorithms: mean-field cavity «message passing» algorithms are among the most powerful for a large class of inference and constraint satisfaction problems

59 Example: phase diagram of satisfiability and coloring, as a function of $\alpha = M/N$. SAT phase ($E = 0$): one state, then many states with $E = 0$ beyond the clustering threshold $\alpha_d$, then many states with $E > 0$; UNSAT phase ($E > 0$) beyond $\alpha_c$. Structural (geometrical) phase transition; it impacts all known algorithms. Survey propagation = best known algorithm.

61 Significant data and knowledge extraction. Images with many pixels; instantaneous patterns of firing neurons; YouTube: «How to draw George W. Bush in 15 seconds».

62 Significant data, and knowledge extraction. Extract significant information from high-dimensional data: for the scientist, the citizen, the neural network, the computer scientist (machine learning). NB: JPEG compression (a large reduction in size).

64 Compressed sensing. Possible applications:
- Rapid magnetic resonance imaging
- Tomography
- Electronic microscopy
- Fluorescence microscopy
- Astronomy
- Generalized linear regression
- Group testing
- Inferring regulatory interactions among many genes using only a limited number of experimental conditions
- ...
How does the brain work?

65 The simplest problem: getting a signal from some measurements = linear transforms. Consider a system of linear measurements $y = Fs$: measurements $y = (y_1, \dots, y_M)^T$, signal $s = (s_1, \dots, s_N)^T$, $F$ an $M \times N$ matrix. Random $F$: «random projections» (incoherent with the signal). Problem: find $s$ when $M < N$ and $s$ is sparse.

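Setting up such an instance takes a few lines. A sketch with illustrative sizes ($N = 200$, $M = 100$, $R = 10$ nonzeros) and a fixed seed:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, R = 200, 100, 10          # signal size, measurements, nonzeros (illustrative)

# Sparse signal: R nonzero Gaussian entries out of N.
s = np.zeros(N)
support = rng.choice(N, size=R, replace=False)
s[support] = rng.normal(size=R)

# Random "projection" matrix and measurements y = F s.
F = rng.normal(size=(M, N)) / np.sqrt(N)
y = F @ s
```
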
66 Compressed sensing as an optimization problem: the $L_1$-norm approach. Find an $N$-component vector $x$ such that the $M$ equations $y = Fx$ are satisfied and $\|x\|$ is minimal. Hopefully $x = s$. Here $\|x\|_0$ = number of non-zero components, and $\|x\|_p = \sum_i |x_i|^p$. Ideally, use $\|x\|_0$; in practice, use $\|x\|_1$. NB, «noisy version»: $y = Fx + \text{noise}$.

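The $L_1$ minimization is a linear program once $x$ is split into positive and negative parts $x = u - v$, $u, v \ge 0$. A sketch using scipy's `linprog` (sizes and seed illustrative); since the true $s$ is feasible, the optimum's $L_1$ norm cannot exceed $\|s\|_1$.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
N, M, R = 60, 40, 4
s = np.zeros(N)
s[rng.choice(N, size=R, replace=False)] = rng.normal(size=R)
F = rng.normal(size=(M, N))
y = F @ s

# Basis pursuit: min ||x||_1  s.t.  F x = y, as a linear program over
# the split x = u - v with u, v >= 0, minimizing sum(u + v).
c = np.ones(2 * N)
A_eq = np.hstack([F, -F])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * N))
x_hat = res.x[:N] - res.x[N:]
```
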
68 Phase diagram, phase transitions (Gaussian random matrix). Axes: $\alpha = M/N$, the number of measurements per variable, and $\rho = R/N$, the fraction of nonzero variables. (Figure: region where reconstruction is possible with $L_1$, region where reconstruction is impossible.) Donoho 2006; Donoho, Tanner 2005; Kabashima, Wadayama and Tanaka, JSTAT 2009.

69 An alternative approach, able to reach the optimal rate $\alpha = \rho$ (Krzakala, Sausset, Mézard, Sun, Zdeborová):
- a probabilistic approach = inference;
- message-passing reconstruction of the signal;
- careful design of the measurement matrix.
NB: each of these three ingredients is crucial.

70 Step 1: probabilistic approach to compressed sensing. Probability $P(x)$ that the signal is $x$: (i) $x$ must be compatible with the measurements, $\sum_i F_{\mu i} x_i = y_\mu$; (ii) an a priori measure on $x$ favours sparsity. Gauss-Bernoulli prior: $x_i = 0$ with probability $1 - \rho$, and $x_i$ drawn from a Gaussian distribution with probability $\rho$. Theorem: with this measure, the original signal $x = s$ is the most probable (not obvious!).

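Drawing from the Gauss-Bernoulli prior is straightforward; a sketch with an illustrative $\rho = 0.1$, checking that the empirical density of nonzeros matches $\rho$:

```python
import numpy as np

rng = np.random.default_rng(3)
rho, N = 0.1, 100_000    # sparsity and sample size (illustrative)

# x_i = 0 with probability 1 - rho, standard Gaussian with probability rho.
mask = rng.random(N) < rho
x = np.where(mask, rng.normal(size=N), 0.0)

empirical_density = np.count_nonzero(x) / N   # should be close to rho
```
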
74 Step 2: sampling from the constrained measure
$P(x) = \frac{1}{Z} \prod_{i=1}^N \left[(1-\rho)\,\delta(x_i) + \rho\,\phi(x_i)\right] \prod_{\mu=1}^M \delta\!\left(y_\mu - \sum_i F_{\mu i} x_i\right)$, with $\phi$ a Gaussian.
The «native configuration» = stored signal $x_i = s_i$ is infinitely more probable than other configurations. Efficient sampling? Not so easy. $M$ constraints, each involving all the variables: «long-range» weak interactions (e.g. the Curie-Weiss model for magnets), so mean field is exact*. «Mean field» here = belief propagation (spin-glass mean-field equations, TAP).

78 Belief propagation = mean-field-like equations. Local order parameters on each edge $i\mu$: $a_{i\to\mu} = \langle x_i \rangle_\mu$ and $v_{i\to\mu} = \langle x_i^2 \rangle_\mu - \langle x_i \rangle_\mu^2$, where $\langle\,\cdot\,\rangle_\mu$ denotes the mean in the absence of constraint $\mu$ (a «cavity»-type measure). Closed self-consistent equations relate these order parameters («BP», «TAP», «G-AMP», ...). The four «messages» sent along each edge $i\mu$ ($4NM$ numbers) can be simplified to $O(N)$ parameters.

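The G-AMP equations themselves are too long to quote here; as a much simpler cousin of these iterative reconstruction schemes (iterative soft thresholding, ISTA, not the BP/G-AMP of the slides), one can minimize the Lagrangian form $\frac{1}{2}\|y - Fx\|^2 + \lambda \|x\|_1$. Sizes, seed and $\lambda$ are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
N, M, R = 100, 60, 5
s = np.zeros(N)
s[rng.choice(N, size=R, replace=False)] = rng.normal(size=R)
F = rng.normal(size=(M, N)) / np.sqrt(M)
y = F @ s

def ista(F, y, lam=0.01, n_iter=500):
    """Iterative soft thresholding for min 0.5*||y - F x||^2 + lam*||x||_1."""
    L = np.linalg.norm(F, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(F.shape[1])
    for _ in range(n_iter):
        g = x + F.T @ (y - F @ x) / L      # gradient step on the quadratic part
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

x_hat = ista(F, y)
mse = np.mean((x_hat - s) ** 2)
```

ISTA is a monotone descent method with step $1/L$, so starting from $x = 0$ the objective can only decrease.
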
79 Performance of the probabilistic approach + message passing + parameter learning: simulations, and analytic study of the large-$N$ limit (replica method, cavity method).

80 Analytic study: cavity equations, density evolution, replicas, state evolution. The replica method allows one to compute the «free entropy» $\Phi(D) = \lim_{N\to\infty} \frac{1}{N} \log P(D)$, where $P(D)$ is the probability that the reconstructed $x$ is at distance $D$ from the original signal $s$, with $D = \frac{1}{N} \sum_i (x_i - s_i)^2$. The cavity method shows that the order parameters of the BP iteration flow according to the gradient of the replica free entropy («density evolution» equations): analytic control of the BP equations. NB, rigorous versions: Bayati-Montanari, Lelarge-Montanari.

81 Dynamic glass transition: when $\alpha$ is too small, BP is trapped in a glass phase. (Figures: free entropy $\Phi(D) \propto \log P(D)$ vs distance $D$ to the native state, for $\alpha = 0.62, 0.6, 0.58, \dots$ at $\rho = 0.4$; BP convergence time (number of iterations) vs $\alpha$, with markers at $\alpha = \rho$, $\alpha_{rBP}$, $\alpha_{L1}$, and seeded BEP for $L = 10$ and $L = 20$.)

82 Performance of BP: phase diagram. (Figure: $\alpha$ vs $\rho$ plane with the curves $\alpha_{L1}(\rho)$, $\alpha_{rBP}(\rho)$, S-BEP, and the line $\alpha = \rho$.)

83 Step 3: design the measurement matrix in order to get around the glass transition. Getting around the glass trap: design the matrix $F$ so that one nucleates the native state (crystal-nucleation idea, borrowed from error-correcting codes: Felström-Zigangirov; Kudekar, Richardson, Urbanke; Hassani, Macris, Urbanke; ...). «Seeded BP»: a seed to nucleate the crystal.

84 A construction inspired by the spatially coupled matrices developed in coding theory (cf. Urbanke et al.). Mixed mean-field and one-dimensional system: 1) create many mean-field subsystems (subsystems 1 to 5 in the sketch).

85 Mixed mean-field and one-dimensional system: 2) add a first-neighbor coupling between successive subsystems.

86 Mixed mean-field and one-dimensional system: 3) choose parameters such that the first subsystem is in the region of the phase diagram where there is no metastability: a large value of $\alpha$ in subsystem 1 (the seed), low values of $\alpha$ in the others. On average, $\alpha$ is still low!

87 Mixed mean-field and one-dimensional system: 4) the solution first appears in the first subsystem (with large $\alpha$, the seed), and then propagates through the whole system.

90 Numerical study. (Figure: mean square error per block vs block index, at $t = 1, 4, 10, 20, 50, 100, 150, 200, 250, 300$; at $t = 10$ the first block is decoded, at $t = 100$ blocks 1 to 9.) Parameters: $L = 20$, $\rho = 0.4$, $J_1 = 20$, $J_2 = 0.2$, $\alpha_1 = 1$, $\alpha = 0.5$.

91 Seeded BP: a threshold-reaching strategy. (Figure: phase diagram with $\alpha_{L1}(\rho)$, $\alpha_{rBP}(\rho)$, S-BEP, the line $\alpha = \rho$, and the BP and seeded-BP thresholds.) Theory: the seeded-BP threshold reaches $\alpha = \rho$ when $L \to \infty$: the $L_1$ phase transition line moves up when using a seeding matrix $F$.

92 Optimal performance on artificial signals (sparse, with independent components). More realistic signals?

93 Compressed sensing: measure «random projections» and reconstruct the signal. (Figure: an original image reconstructed at compression factors $\alpha = 0.6, 0.5, 0.4, 0.3, 0.2$ by three methods: $L_1$ linear relaxation (Candès-Tao, Donoho); BEP, a mean-field algorithm; S-BEP, mean field + a special strategy to avoid the glass transition (Krzakala et al.). $\alpha = \rho$ is the optimal compression (information theory).)

94 Bird's-eye view. Failure of the «representative agent» mean-field idea: it is impossible to describe spin glasses (or structural glasses) with a single effective spin; that misses all the richness. Go beyond effective-medium descriptions, e.g. for epidemic propagation (SIR) or gene expression: beyond the law of mass action. «Whom or what does the representative agent represent?» (A. Kirman); minority games. Substituted by statistical approaches to the heterogeneous behaviors of individual agents.

95 Bird's-eye view. The «representative agent» mean-field idea is substituted by statistical approaches to the heterogeneous behaviors of individual agents, plus simulations and algorithms. More complicated («ordered phases», equilibration times, etc.), but much, much richer. This «curse of dimensionality poses several challenges to both technical and conceptual progress in neuroscience», and likewise in other disciplines (big data). Infinitely complex?

99 The replica method, the cavity method, and all the conceptual, mathematical and physical challenges that appeared in the study of spin glasses have been a «source from which new knowledge and applications have flowed profusely»* since their invention 35 years ago. Two amusing remarks: physical spin glasses are still far from well understood, and the «useless» spin glasses have found applications in fields that were totally unexpected. (Portrait of Ottavio Strada, Tintoretto, Venice 1567, Rijksmuseum Amsterdam.) * D. Sherrington

101 Structured measurement matrix, $y = Fs$. The matrix elements $F_{\mu i}$ are independent random Gaussian variables with zero mean and variance $J_{b(\mu)b(i)}/N$, where $b(\cdot)$ labels the blocks. Block structure of the variances: 1 on the diagonal (unit coupling), $J_1$ on the first sub-diagonal, $J_2$ on the first super-diagonal, and 0 elsewhere (null elements).

102 Block 1 has a large value of $M_1$, such that the solution arises in this block and then propagates through the whole system. Example: $L = 8$ blocks, $N_i = N/L$, $M_i = \alpha_i N/L$, with $\alpha_1 > \alpha_{BP}$ and $\alpha_j = \alpha_0 < \alpha_{BP}$ for $j \ge 2$, so that $\alpha = \frac{1}{L}\left(\alpha_1 + (L-1)\,\alpha_0\right)$.

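The block pattern of variances and the rate accounting can be written out directly. A sketch, assuming the tridiagonal pattern read off the slide (1 on the diagonal, $J_1$ just below, $J_2$ just above) and illustrative parameter values taken from the numerical-study slide:

```python
import numpy as np

# Variance pattern of the blocks of F in the spatially coupled construction.
L_blocks, J1, J2 = 8, 20.0, 0.2
V = np.zeros((L_blocks, L_blocks))
for b in range(L_blocks):
    V[b, b] = 1.0                 # unit coupling on the diagonal
    if b + 1 < L_blocks:
        V[b + 1, b] = J1          # coupling to the previous block
        V[b, b + 1] = J2          # coupling to the next block

# Per-block measurement rates: a large seed alpha_1, smaller alpha_0 after,
# so the overall rate alpha = (alpha_1 + (L-1)*alpha_0) / L stays low.
alpha_1, alpha_0 = 1.0, 0.5
alpha = (alpha_1 + (L_blocks - 1) * alpha_0) / L_blocks
```
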
105 An example: tomography of binary mixtures. Synchrotron light; $L$ measurements for each angle, $L$ angles: $L^2$ measurements for $L^2$ pixels.

108 An example: tomography of binary mixtures. The sample has size 1, the pixels size $1/L$. If the size of the domains is large compared to the pixels, it is possible to reconstruct the $L^2$-pixel image from far fewer than $L^2$ measurements: this picture, digitalized on a grid, can be reconstructed from measurements with only 16 angles (Gouillart et al., arXiv).


More information

Threshold Saturation on Channels with Memory via Spatial Coupling

Threshold Saturation on Channels with Memory via Spatial Coupling Threshold Saturation on Channels with Memory via Spatial Coupling Shrinivas Kudekar and Kenta Kasai New Mexico Consortium and Center for Non-linear Studies, Los Alamos National Laboratory, NM, USA Email:

More information

Introduction to Graphical Models. Srikumar Ramalingam School of Computing University of Utah

Introduction to Graphical Models. Srikumar Ramalingam School of Computing University of Utah Introduction to Graphical Models Srikumar Ramalingam School of Computing University of Utah Reference Christopher M. Bishop, Pattern Recognition and Machine Learning, Jonathan S. Yedidia, William T. Freeman,

More information

Spin glasses, where do we stand?

Spin glasses, where do we stand? Spin glasses, where do we stand? Giorgio Parisi Many progresses have recently done in spin glasses: theory, experiments, simulations and theorems! In this talk I will present: A very brief introduction

More information

Performance Regions in Compressed Sensing from Noisy Measurements

Performance Regions in Compressed Sensing from Noisy Measurements Performance egions in Compressed Sensing from Noisy Measurements Junan Zhu and Dror Baron Department of Electrical and Computer Engineering North Carolina State University; aleigh, NC 27695, USA Email:

More information

Mind the gap Solving optimization problems with a quantum computer

Mind the gap Solving optimization problems with a quantum computer Mind the gap Solving optimization problems with a quantum computer A.P. Young http://physics.ucsc.edu/~peter Work supported by Talk at Saarbrücken University, November 5, 2012 Collaborators: I. Hen, E.

More information

EE229B - Final Project. Capacity-Approaching Low-Density Parity-Check Codes

EE229B - Final Project. Capacity-Approaching Low-Density Parity-Check Codes EE229B - Final Project Capacity-Approaching Low-Density Parity-Check Codes Pierre Garrigues EECS department, UC Berkeley garrigue@eecs.berkeley.edu May 13, 2005 Abstract The class of low-density parity-check

More information

Compressed Sensing and Linear Codes over Real Numbers

Compressed Sensing and Linear Codes over Real Numbers Compressed Sensing and Linear Codes over Real Numbers Henry D. Pfister (joint with Fan Zhang) Texas A&M University College Station Information Theory and Applications Workshop UC San Diego January 31st,

More information

Neural coding Ecological approach to sensory coding: efficient adaptation to the natural environment

Neural coding Ecological approach to sensory coding: efficient adaptation to the natural environment Neural coding Ecological approach to sensory coding: efficient adaptation to the natural environment Jean-Pierre Nadal CNRS & EHESS Laboratoire de Physique Statistique (LPS, UMR 8550 CNRS - ENS UPMC Univ.

More information

Propagating beliefs in spin glass models. Abstract

Propagating beliefs in spin glass models. Abstract Propagating beliefs in spin glass models Yoshiyuki Kabashima arxiv:cond-mat/0211500v1 [cond-mat.dis-nn] 22 Nov 2002 Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology,

More information

Chapter 9 Fundamental Limits in Information Theory

Chapter 9 Fundamental Limits in Information Theory Chapter 9 Fundamental Limits in Information Theory Information Theory is the fundamental theory behind information manipulation, including data compression and data transmission. 9.1 Introduction o For

More information

Other Topics in Quantum Information

Other Topics in Quantum Information p. 1/23 Other Topics in Quantum Information In a course like this there is only a limited time, and only a limited number of topics can be covered. Some additional topics will be covered in the class projects.

More information

On the number of circuits in random graphs. Guilhem Semerjian. [ joint work with Enzo Marinari and Rémi Monasson ]

On the number of circuits in random graphs. Guilhem Semerjian. [ joint work with Enzo Marinari and Rémi Monasson ] On the number of circuits in random graphs Guilhem Semerjian [ joint work with Enzo Marinari and Rémi Monasson ] [ Europhys. Lett. 73, 8 (2006) ] and [ cond-mat/0603657 ] Orsay 13-04-2006 Outline of the

More information

LDPC Codes. Intracom Telecom, Peania

LDPC Codes. Intracom Telecom, Peania LDPC Codes Alexios Balatsoukas-Stimming and Athanasios P. Liavas Technical University of Crete Dept. of Electronic and Computer Engineering Telecommunications Laboratory December 16, 2011 Intracom Telecom,

More information

Low-Density Parity-Check Codes

Low-Density Parity-Check Codes Department of Computer Sciences Applied Algorithms Lab. July 24, 2011 Outline 1 Introduction 2 Algorithms for LDPC 3 Properties 4 Iterative Learning in Crowds 5 Algorithm 6 Results 7 Conclusion PART I

More information

16.36 Communication Systems Engineering

16.36 Communication Systems Engineering MIT OpenCourseWare http://ocw.mit.edu 16.36 Communication Systems Engineering Spring 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. 16.36: Communication

More information

An introduction to basic information theory. Hampus Wessman

An introduction to basic information theory. Hampus Wessman An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on

More information

Reconstruction in the Generalized Stochastic Block Model

Reconstruction in the Generalized Stochastic Block Model Reconstruction in the Generalized Stochastic Block Model Marc Lelarge 1 Laurent Massoulié 2 Jiaming Xu 3 1 INRIA-ENS 2 INRIA-Microsoft Research Joint Centre 3 University of Illinois, Urbana-Champaign GDR

More information

Learning About Spin Glasses Enzo Marinari (Cagliari, Italy) I thank the organizers... I am thrilled since... (Realistic) Spin Glasses: a debated, inte

Learning About Spin Glasses Enzo Marinari (Cagliari, Italy) I thank the organizers... I am thrilled since... (Realistic) Spin Glasses: a debated, inte Learning About Spin Glasses Enzo Marinari (Cagliari, Italy) I thank the organizers... I am thrilled since... (Realistic) Spin Glasses: a debated, interesting issue. Mainly work with: Giorgio Parisi, Federico

More information

Performance Analysis and Code Optimization of Low Density Parity-Check Codes on Rayleigh Fading Channels

Performance Analysis and Code Optimization of Low Density Parity-Check Codes on Rayleigh Fading Channels Performance Analysis and Code Optimization of Low Density Parity-Check Codes on Rayleigh Fading Channels Jilei Hou, Paul H. Siegel and Laurence B. Milstein Department of Electrical and Computer Engineering

More information

Spatially Coupled LDPC Codes

Spatially Coupled LDPC Codes Spatially Coupled LDPC Codes Kenta Kasai Tokyo Institute of Technology 30 Aug, 2013 We already have very good codes. Efficiently-decodable asymptotically capacity-approaching codes Irregular LDPC Codes

More information

Physics on Random Graphs

Physics on Random Graphs Physics on Random Graphs Lenka Zdeborová (CNLS + T-4, LANL) in collaboration with Florent Krzakala (see MRSEC seminar), Marc Mezard, Marco Tarzia... Take Home Message Solving problems on random graphs

More information

Making Error Correcting Codes Work for Flash Memory

Making Error Correcting Codes Work for Flash Memory Making Error Correcting Codes Work for Flash Memory Part I: Primer on ECC, basics of BCH and LDPC codes Lara Dolecek Laboratory for Robust Information Systems (LORIS) Center on Development of Emerging

More information

THERE is a deep connection between the theory of linear

THERE is a deep connection between the theory of linear 664 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 53, NO. 2, FEBRUARY 2007 Griffith Kelly Sherman Correlation Inequalities: A Useful Tool in the Theory of Error Correcting Codes Nicolas Macris, Member,

More information

LDPC Codes. Slides originally from I. Land p.1

LDPC Codes. Slides originally from I. Land p.1 Slides originally from I. Land p.1 LDPC Codes Definition of LDPC Codes Factor Graphs to use in decoding Decoding for binary erasure channels EXIT charts Soft-Output Decoding Turbo principle applied to

More information

Quasi-cyclic Low Density Parity Check codes with high girth

Quasi-cyclic Low Density Parity Check codes with high girth Quasi-cyclic Low Density Parity Check codes with high girth, a work with Marta Rossi, Richard Bresnan, Massimilliano Sala Summer Doctoral School 2009 Groebner bases, Geometric codes and Order Domains Dept

More information

Statistical mechanics and capacity-approaching error-correctingcodes

Statistical mechanics and capacity-approaching error-correctingcodes Physica A 302 (2001) 14 21 www.elsevier.com/locate/physa Statistical mechanics and capacity-approaching error-correctingcodes Nicolas Sourlas Laboratoire de Physique Theorique de l, UMR 8549, Unite Mixte

More information

Part III Advanced Coding Techniques

Part III Advanced Coding Techniques Part III Advanced Coding Techniques José Vieira SPL Signal Processing Laboratory Departamento de Electrónica, Telecomunicações e Informática / IEETA Universidade de Aveiro, Portugal 2010 José Vieira (IEETA,

More information

arxiv: v2 [cond-mat.dis-nn] 4 Dec 2008

arxiv: v2 [cond-mat.dis-nn] 4 Dec 2008 Constraint satisfaction problems with isolated solutions are hard Lenka Zdeborová Université Paris-Sud, LPTMS, UMR8626, Bât. 00, Université Paris-Sud 9405 Orsay cedex CNRS, LPTMS, UMR8626, Bât. 00, Université

More information

CS6304 / Analog and Digital Communication UNIT IV - SOURCE AND ERROR CONTROL CODING PART A 1. What is the use of error control coding? The main use of error control coding is to reduce the overall probability

More information

Tractable Upper Bounds on the Restricted Isometry Constant

Tractable Upper Bounds on the Restricted Isometry Constant Tractable Upper Bounds on the Restricted Isometry Constant Alex d Aspremont, Francis Bach, Laurent El Ghaoui Princeton University, École Normale Supérieure, U.C. Berkeley. Support from NSF, DHS and Google.

More information

Lecture 4 : Introduction to Low-density Parity-check Codes

Lecture 4 : Introduction to Low-density Parity-check Codes Lecture 4 : Introduction to Low-density Parity-check Codes LDPC codes are a class of linear block codes with implementable decoders, which provide near-capacity performance. History: 1. LDPC codes were

More information

Linear and conic programming relaxations: Graph structure and message-passing

Linear and conic programming relaxations: Graph structure and message-passing Linear and conic programming relaxations: Graph structure and message-passing Martin Wainwright UC Berkeley Departments of EECS and Statistics Banff Workshop Partially supported by grants from: National

More information

Long Range Frustration in Finite Connectivity Spin Glasses: Application to the random K-satisfiability problem

Long Range Frustration in Finite Connectivity Spin Glasses: Application to the random K-satisfiability problem arxiv:cond-mat/0411079v1 [cond-mat.dis-nn] 3 Nov 2004 Long Range Frustration in Finite Connectivity Spin Glasses: Application to the random K-satisfiability problem Haijun Zhou Max-Planck-Institute of

More information

Shannon s noisy-channel theorem

Shannon s noisy-channel theorem Shannon s noisy-channel theorem Information theory Amon Elders Korteweg de Vries Institute for Mathematics University of Amsterdam. Tuesday, 26th of Januari Amon Elders (Korteweg de Vries Institute for

More information

Single-Gaussian Messages and Noise Thresholds for Low-Density Lattice Codes

Single-Gaussian Messages and Noise Thresholds for Low-Density Lattice Codes Single-Gaussian Messages and Noise Thresholds for Low-Density Lattice Codes Brian M. Kurkoski, Kazuhiko Yamaguchi and Kingo Kobayashi kurkoski@ice.uec.ac.jp Dept. of Information and Communications Engineering

More information

Communication by Regression: Sparse Superposition Codes

Communication by Regression: Sparse Superposition Codes Communication by Regression: Sparse Superposition Codes Department of Statistics, Yale University Coauthors: Antony Joseph and Sanghee Cho February 21, 2013, University of Texas Channel Communication Set-up

More information

Mind the gap Solving optimization problems with a quantum computer

Mind the gap Solving optimization problems with a quantum computer Mind the gap Solving optimization problems with a quantum computer A.P. Young http://physics.ucsc.edu/~peter Work supported by Talk at the London Centre for Nanotechnology, October 17, 2012 Collaborators:

More information

Why is Deep Learning so effective?

Why is Deep Learning so effective? Ma191b Winter 2017 Geometry of Neuroscience The unreasonable effectiveness of deep learning This lecture is based entirely on the paper: Reference: Henry W. Lin and Max Tegmark, Why does deep and cheap

More information

Shannon's Theory of Communication

Shannon's Theory of Communication Shannon's Theory of Communication An operational introduction 5 September 2014, Introduction to Information Systems Giovanni Sileno g.sileno@uva.nl Leibniz Center for Law University of Amsterdam Fundamental

More information

Through Kaleidoscope Eyes Spin Glasses Experimental Results and Theoretical Concepts

Through Kaleidoscope Eyes Spin Glasses Experimental Results and Theoretical Concepts Through Kaleidoscope Eyes Spin Glasses Experimental Results and Theoretical Concepts Benjamin Hsu December 11, 2007 Abstract A spin glass describes a system of spins on a lattice (or a crystal) where the

More information

Is the Sherrington-Kirkpatrick Model relevant for real spin glasses?

Is the Sherrington-Kirkpatrick Model relevant for real spin glasses? Is the Sherrington-Kirkpatrick Model relevant for real spin glasses? A. P. Young Department of Physics, University of California, Santa Cruz, California 95064 E-mail: peter@physics.ucsc.edu Abstract. I

More information

On Generalized EXIT Charts of LDPC Code Ensembles over Binary-Input Output-Symmetric Memoryless Channels

On Generalized EXIT Charts of LDPC Code Ensembles over Binary-Input Output-Symmetric Memoryless Channels 2012 IEEE International Symposium on Information Theory Proceedings On Generalied EXIT Charts of LDPC Code Ensembles over Binary-Input Output-Symmetric Memoryless Channels H Mamani 1, H Saeedi 1, A Eslami

More information

The non-backtracking operator

The non-backtracking operator The non-backtracking operator Florent Krzakala LPS, Ecole Normale Supérieure in collaboration with Paris: L. Zdeborova, A. Saade Rome: A. Decelle Würzburg: J. Reichardt Santa Fe: C. Moore, P. Zhang Berkeley:

More information

Approximate counting of large subgraphs in random graphs with statistical mechanics methods

Approximate counting of large subgraphs in random graphs with statistical mechanics methods Approximate counting of large subgraphs in random graphs with statistical mechanics methods Guilhem Semerjian LPT-ENS Paris 13.03.08 / Eindhoven in collaboration with Rémi Monasson, Enzo Marinari and Valery

More information

The Last Survivor: a Spin Glass Phase in an External Magnetic Field.

The Last Survivor: a Spin Glass Phase in an External Magnetic Field. The Last Survivor: a Spin Glass Phase in an External Magnetic Field. J. J. Ruiz-Lorenzo Dep. Física, Universidad de Extremadura Instituto de Biocomputación y Física de los Sistemas Complejos (UZ) http://www.eweb.unex.es/eweb/fisteor/juan

More information

An Introduction to Algorithmic Coding Theory

An Introduction to Algorithmic Coding Theory An Introduction to Algorithmic Coding Theory M. Amin Shokrollahi Bell Laboratories Part : Codes - A puzzle What do the following problems have in common? 2 Problem : Information Transmission MESSAGE G

More information

arxiv: v2 [cs.it] 6 Sep 2016

arxiv: v2 [cs.it] 6 Sep 2016 The Mutual Information in Random Linear Estimation Jean Barbier, Mohamad Dia, Nicolas Macris and Florent Krzakala Laboratoire de Théorie des Communications, Faculté Informatique et Communications, Ecole

More information

Sample Complexity of Bayesian Optimal Dictionary Learning

Sample Complexity of Bayesian Optimal Dictionary Learning Sample Complexity of Bayesian Optimal Dictionary Learning Ayaka Sakata and Yoshiyuki Kabashima Dep. of Computational Intelligence & Systems Science Tokyo Institute of Technology Yokohama 6-85, Japan Email:

More information

Statistical Mechanics of Multi Terminal Data Compression

Statistical Mechanics of Multi Terminal Data Compression MEXT Grant-in-Aid for Scientific Research on Priority Areas Statistical Mechanical Approach to Probabilistic Information Processing Workshop on Mathematics of Statistical Inference (December 004, Tohoku

More information

CHAPTER 3 LOW DENSITY PARITY CHECK CODES

CHAPTER 3 LOW DENSITY PARITY CHECK CODES 62 CHAPTER 3 LOW DENSITY PARITY CHECK CODES 3. INTRODUCTION LDPC codes were first presented by Gallager in 962 [] and in 996, MacKay and Neal re-discovered LDPC codes.they proved that these codes approach

More information

Time-invariant LDPC convolutional codes

Time-invariant LDPC convolutional codes Time-invariant LDPC convolutional codes Dimitris Achlioptas, Hamed Hassani, Wei Liu, and Rüdiger Urbanke Department of Computer Science, UC Santa Cruz, USA Email: achlioptas@csucscedu Department of Computer

More information

Lecture 8: Shannon s Noise Models

Lecture 8: Shannon s Noise Models Error Correcting Codes: Combinatorics, Algorithms and Applications (Fall 2007) Lecture 8: Shannon s Noise Models September 14, 2007 Lecturer: Atri Rudra Scribe: Sandipan Kundu& Atri Rudra Till now we have

More information

3F1 Information Theory, Lecture 3

3F1 Information Theory, Lecture 3 3F1 Information Theory, Lecture 3 Jossy Sayir Department of Engineering Michaelmas 2013, 29 November 2013 Memoryless Sources Arithmetic Coding Sources with Memory Markov Example 2 / 21 Encoding the output

More information

Recent Developments in Compressed Sensing

Recent Developments in Compressed Sensing Recent Developments in Compressed Sensing M. Vidyasagar Distinguished Professor, IIT Hyderabad m.vidyasagar@iith.ac.in, www.iith.ac.in/ m vidyasagar/ ISL Seminar, Stanford University, 19 April 2018 Outline

More information

2 Information transmission is typically corrupted by noise during transmission. Various strategies have been adopted for reducing or eliminating the n

2 Information transmission is typically corrupted by noise during transmission. Various strategies have been adopted for reducing or eliminating the n Finite size eects and error-free communication in Gaussian channels Ido Kanter and David Saad # Minerva Center and Department of Physics, Bar-Ilan University, Ramat-Gan 52900, Israel. # The Neural Computing

More information

Codes on graphs and iterative decoding

Codes on graphs and iterative decoding Codes on graphs and iterative decoding Bane Vasić Error Correction Coding Laboratory University of Arizona Prelude Information transmission 0 0 0 0 0 0 Channel Information transmission signal 0 0 threshold

More information

Information Theory (Information Theory by J. V. Stone, 2015)

Information Theory (Information Theory by J. V. Stone, 2015) Information Theory (Information Theory by J. V. Stone, 2015) Claude Shannon (1916 2001) Shannon, C. (1948). A mathematical theory of communication. Bell System Technical Journal, 27:379 423. A mathematical

More information

Growth Rate of Spatially Coupled LDPC codes

Growth Rate of Spatially Coupled LDPC codes Growth Rate of Spatially Coupled LDPC codes Workshop on Spatially Coupled Codes and Related Topics at Tokyo Institute of Technology 2011/2/19 Contents 1. Factor graph, Bethe approximation and belief propagation

More information

ECC for NAND Flash. Osso Vahabzadeh. TexasLDPC Inc. Flash Memory Summit 2017 Santa Clara, CA 1

ECC for NAND Flash. Osso Vahabzadeh. TexasLDPC Inc. Flash Memory Summit 2017 Santa Clara, CA 1 ECC for NAND Flash Osso Vahabzadeh TexasLDPC Inc. 1 Overview Why Is Error Correction Needed in Flash Memories? Error Correction Codes Fundamentals Low-Density Parity-Check (LDPC) Codes LDPC Encoding and

More information

Using a Hopfield Network: A Nuts and Bolts Approach

Using a Hopfield Network: A Nuts and Bolts Approach Using a Hopfield Network: A Nuts and Bolts Approach November 4, 2013 Gershon Wolfe, Ph.D. Hopfield Model as Applied to Classification Hopfield network Training the network Updating nodes Sequencing of

More information

Phase Transitions in Physics and Computer Science. Cristopher Moore University of New Mexico and the Santa Fe Institute

Phase Transitions in Physics and Computer Science. Cristopher Moore University of New Mexico and the Santa Fe Institute Phase Transitions in Physics and Computer Science Cristopher Moore University of New Mexico and the Santa Fe Institute Magnetism When cold enough, Iron will stay magnetized, and even magnetize spontaneously

More information

channel of communication noise Each codeword has length 2, and all digits are either 0 or 1. Such codes are called Binary Codes.

channel of communication noise Each codeword has length 2, and all digits are either 0 or 1. Such codes are called Binary Codes. 5 Binary Codes You have already seen how check digits for bar codes (in Unit 3) and ISBN numbers (Unit 4) are used to detect errors. Here you will look at codes relevant for data transmission, for example,

More information

Phase Transitions in the Coloring of Random Graphs

Phase Transitions in the Coloring of Random Graphs Phase Transitions in the Coloring of Random Graphs Lenka Zdeborová (LPTMS, Orsay) In collaboration with: F. Krząkała (ESPCI Paris) G. Semerjian (ENS Paris) A. Montanari (Standford) F. Ricci-Tersenghi (La

More information

Low-Density Parity-Check Codes A Statistical Physics Perspective

Low-Density Parity-Check Codes A Statistical Physics Perspective ADVANCES IN IMAGING AND ELECTRON PHYSICS, VOL. 125 Low-Density Parity-Check Codes A Statistical Physics Perspective RENATO VICENTE, 1, DAVID SAAD 1 AND YOSHIYUKI KABASHIMA 2 1 Neural Computing Research

More information

arxiv:cond-mat/ v3 16 Sep 2003

arxiv:cond-mat/ v3 16 Sep 2003 replica equivalence in the edwards-anderson model arxiv:cond-mat/0302500 v3 16 Sep 2003 Pierluigi Contucci Dipartimento di Matematica Università di Bologna, 40127 Bologna, Italy e-mail: contucci@dm.unibo.it

More information

Learning the dynamics of biological networks

Learning the dynamics of biological networks Learning the dynamics of biological networks Thierry Mora ENS Paris A.Walczak Università Sapienza Rome A. Cavagna I. Giardina O. Pohl E. Silvestri M.Viale Aberdeen University F. Ginelli Laboratoire de

More information

Belief-Propagation Decoding of LDPC Codes

Belief-Propagation Decoding of LDPC Codes LDPC Codes: Motivation Belief-Propagation Decoding of LDPC Codes Amir Bennatan, Princeton University Revolution in coding theory Reliable transmission, rates approaching capacity. BIAWGN, Rate =.5, Threshold.45

More information

Quantum Annealing in spin glasses and quantum computing Anders W Sandvik, Boston University

Quantum Annealing in spin glasses and quantum computing Anders W Sandvik, Boston University PY502, Computational Physics, December 12, 2017 Quantum Annealing in spin glasses and quantum computing Anders W Sandvik, Boston University Advancing Research in Basic Science and Mathematics Example:

More information

Hadamard Codes. A Hadamard matrix of order n is a matrix H n with elements 1 or 1 such that H n H t n = n I n. For example,

Hadamard Codes. A Hadamard matrix of order n is a matrix H n with elements 1 or 1 such that H n H t n = n I n. For example, Coding Theory Massoud Malek Hadamard Codes A Hadamard matrix of order n is a matrix H n with elements 1 or 1 such that H n H t n = n I n. For example, [ ] 1 1 1 1 1 1 1 1 1 1 H 1 = [1], H 2 =, H 1 1 4

More information

Mind the gap Solving optimization problems with a quantum computer

Mind the gap Solving optimization problems with a quantum computer Mind the gap Solving optimization problems with a quantum computer A.P. Young http://physics.ucsc.edu/~peter Work supported by NASA future technologies conference, January 17-212, 2012 Collaborators: Itay

More information

Is there a de Almeida-Thouless line in finite-dimensional spin glasses? (and why you should care)

Is there a de Almeida-Thouless line in finite-dimensional spin glasses? (and why you should care) Is there a de Almeida-Thouless line in finite-dimensional spin glasses? (and why you should care) Peter Young Talk at MPIPKS, September 12, 2013 Available on the web at http://physics.ucsc.edu/~peter/talks/mpipks.pdf

More information

Ma/CS 6b Class 25: Error Correcting Codes 2

Ma/CS 6b Class 25: Error Correcting Codes 2 Ma/CS 6b Class 25: Error Correcting Codes 2 By Adam Sheffer Recall: Codes V n the set of binary sequences of length n. For example, V 3 = 000,001,010,011,100,101,110,111. Codes of length n are subsets

More information

Fault Tolerance Technique in Huffman Coding applies to Baseline JPEG

Fault Tolerance Technique in Huffman Coding applies to Baseline JPEG Fault Tolerance Technique in Huffman Coding applies to Baseline JPEG Cung Nguyen and Robert G. Redinbo Department of Electrical and Computer Engineering University of California, Davis, CA email: cunguyen,

More information

On the Typicality of the Linear Code Among the LDPC Coset Code Ensemble

On the Typicality of the Linear Code Among the LDPC Coset Code Ensemble 5 Conference on Information Sciences and Systems The Johns Hopkins University March 16 18 5 On the Typicality of the Linear Code Among the LDPC Coset Code Ensemble C.-C. Wang S.R. Kulkarni and H.V. Poor

More information