The spin-glass cornucopia


The spin-glass cornucopia. Marc Mézard, École normale supérieure - PSL Research University and CNRS, Université Paris-Sud. LPT-ENS, January 2015

40 years of research

- 70s: anomalies in the magnetic response of some obscure and useless alloys
- 1975: Edwards and Anderson define a model, an order parameter, a new phase of matter
- 1978-: Theory develops along two lines: mean-field theory (Sherrington-Kirkpatrick) and «real space» droplets
- 1979-1985: Solution of the SK model by G. Parisi with replicas. Physical understanding (ultrametricity), cavity method (MPV) + 2000 (MP)

Where do we stand? Real spin glasses: still an open problem. A very sophisticated and powerful corpus of conceptual and methodological approaches has been developed, and used in many different fields. (Portrait of Ottavio Strada, Tintoretto, Venice 1567, Rijksmuseum Amsterdam)

The cornucopia

- Statistical physics (glasses, disordered polymers, ...)
- Computer science problems (satisfiability, coloring)
- Information theory (error correction, inference)
- Signal acquisition and processing - big data; significant information
- Gene expression networks
- Neural networks
- Interacting agents on a market

Why?

Beginning of statistical physics and condensed matter theory: homogeneous systems. All atoms are alike. Easy (so to say).

Gradually: add dirt! Defects, dislocations, pinning sites, mixtures, walls, etc.

Last forty years: handle strongly disordered systems. Each atom has a different environment. Even the nature of the basic phases is hard to understand. Heterogeneous «agents».

Table of contents

- Spin glasses: dynamics, mean-field description
- Information theory: phase transitions and glass phases
- General setup: link to hardness
- Bird's eye view: compressed sensing

Magnets

Spins s_i ∈ {±1} with couplings J_ij; energy E = -Σ_ij J_ij s_i s_j.

Equilibrium: P(s_1, ..., s_N) = (1/Z) e^{-E/T}

Ferromagnet: J_ij = J > 0. At low T the spins align, and P concentrates.

Phases: ⟨s_i⟩ = M, the «representative agent». (Plot: M vs T, two states ±M at low temperature.)

Mean-field self-consistency: M = tanh( (1/T) Σ_j J_ij M ) ≃ tanh( zJM/T )
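The mean-field self-consistency M = tanh(zJM/T) can be solved by simple fixed-point iteration; a minimal sketch (the combined parameter zJ/T and the starting point are illustrative choices):

```python
import math

def mean_field_magnetization(zJ_over_T, m0=0.5, n_iter=200):
    """Iterate the mean-field self-consistency M = tanh(zJ*M/T)."""
    m = m0
    for _ in range(n_iter):
        m = math.tanh(zJ_over_T * m)
    return m

# Above the transition (zJ/T > 1) the iteration flows to a spontaneous
# magnetization; below it (zJ/T < 1) the only fixed point is M = 0.
```

For zJ/T = 2 the iteration converges to a value close to 0.96; for zJ/T = 0.5 it flows to zero.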

Spin glasses (e.g. CuMn)

Spins s_i ∈ {±1}, couplings J_ij, energy E = -Σ_ij J_ij s_i s_j, equilibrium P(s_1, ..., s_N) = (1/Z) e^{-E/T}.

Spin glass: the sign of J_ij depends on the pair (ij) → frustration. What is the behavior?

Spin glasses: linear response to a small magnetic field shows new dynamics and memory effects. (Plot: susceptibility χ (a.u.) vs time (min), during temperature cycling T = 12 K → 10 K → 12 K over intervals t_1, t_2, t_3; E. Vincent et al., SPEC.)

Mean-field lessons

(Sketch: energy vs configurations for an Ising magnet and for a spin glass; overlap (1/N) Σ_i σ_i s_i.)

1- Glass «phase»: many pure states, unrelated by symmetry, organized in a hierarchical «ultrametric» structure
2- Many metastable states, unrelated by symmetry
3- «True» ground state: fragile!

Take-home: glass phases

- Proliferation of states (exponentially many)
- Hierarchical structures
- Various time scales: aging, modification of FDT

Technically, powerful methods:
- Replicas
- Dynamical field theories
- Cavity method: self-consistent equations for m_i^α, the magnetization of spin i in state α, plus statistical analysis

An example from Information Theory

Claude Shannon (1916-2001). Talk: S ~ log_2 N, or S ~ N.

More recent convergence: error-correcting codes, pool testing, compressed sensing.

The new frontier: infinitely complex information.

Information transmission and error correction

Principle of error correction: redundancy

Original message a (L = N - M bits) → Encoder → encoded message x (N bits) → Channel → received message r (N bits) → Decoder → estimate â of the original message (L = N - M bits).

Encoding = add redundancy. Rate L/N.

e.g. repetition: 0 → 000, 1 → 111, rate = 1/3.
1011 → 111000111111 → (channel) → 111000110111 → majority decoding → 1011.
Error probability per block: p^3 + 3p^2(1-p) ≈ 3p^2.

Shannon's theorem: for a given noise level p, one can build an encoder/decoder which transmits reliably at any rate r < r_c(p).

Two ingredients:
- «Thermodynamic limit»: N, L → ∞
- Ensemble of random codes (≈ Random Energy Model of spin glasses)
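The rate-1/3 repetition example can be checked directly; a small sketch of the encoder and majority decoder (the single channel flip is put in by hand):

```python
def encode(bits):
    """Repetition code: each bit is sent three times (rate 1/3)."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each block of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]
```

encode([1, 0, 1, 1]) gives 111000111111; if the channel flips at most one bit per block, majority decoding still recovers the message, so a block fails only with probability p^3 + 3p^2(1-p) ≈ 3p^2.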

Shannon code ensemble

Unit hypercube in dimension N; codewords placed at random; a sent codeword; the received word nearby.

Decoding = find the closest codeword. (Plot: probability of perfect decoding vs noise p, dropping from 1 to 0 at the Shannon threshold.) Shannon's geometric phase transition = REM phase transition.

Perfect codes in principle, but impossible to decode in practice (structureless: time O(e^N)).

Efficient codes: parity checks

Shannon's code is useless (decoding time O(e^N)). Add redundancy with structure that allows decoding.

Variables x_1, ..., x_7 and parity checks a, b, c:
x_1 + x_4 + x_5 + x_7 = 0 (mod 2)
x_2 + x_4 + x_6 + x_7 = 0 (mod 2)
x_3 + x_5 + x_6 + x_7 = 0 (mod 2)

2^4 codewords among the 2^7 words.
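The count of 2^4 codewords among 2^7 words can be verified by brute force, enumerating all 7-bit words against the three parity checks:

```python
from itertools import product

def is_codeword(x):
    """The three parity checks of the slide, mod 2, on x = (x1,...,x7)."""
    x1, x2, x3, x4, x5, x6, x7 = x
    return ((x1 + x4 + x5 + x7) % 2 == 0 and
            (x2 + x4 + x6 + x7) % 2 == 0 and
            (x3 + x5 + x6 + x7) % 2 == 0)

codewords = [x for x in product((0, 1), repeat=7) if is_codeword(x)]
# 3 independent linear constraints on 7 bits leave 2^(7-3) = 16 codewords
```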

Error decoding with parity checks

N variables, M = (1 - R) N equations. Random construction with K variables per equation and L equations per variable. Example: N = 20, M = 10, R = 1/2, L = 3, K = 6.

(Factor-graph picture: flipped bits, and the parity constraints they violate.)

Decoding = iteration of mean-field «cavity» equations: a «message passing» algorithm, «belief propagation».
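A much-simplified cousin of the message-passing decoder is serial bit flipping: repeatedly flip the bit that participates in the most violated parity checks. This is not belief propagation, only a sketch of the same idea of local, constraint-driven updates; the toy parity-check matrix and error pattern below are illustrative, not the (3,6) ensemble:

```python
import numpy as np

def bit_flip_decode(H, r, max_iter=20):
    """Iteratively flip the bit involved in the most violated parity checks."""
    x = r.copy()
    for _ in range(max_iter):
        syndrome = H.dot(x) % 2            # 1 where a parity check is violated
        if not syndrome.any():
            return x                        # all checks satisfied: codeword found
        votes = H.T.dot(syndrome)           # violated checks per variable
        x[int(np.argmax(votes))] ^= 1       # flip the most suspect bit
    return x

# Toy parity-check matrix (3 checks, 6 variables), all-zero codeword, one flip:
H = np.array([[1, 1, 1, 0, 0, 0],
              [0, 0, 1, 1, 1, 0],
              [1, 0, 0, 0, 1, 1]])
r = np.array([0, 0, 1, 0, 0, 0])            # bit 2 flipped by the channel
decoded = bit_flip_decode(H, r)
```

Bit 2 sits in two violated checks while every other bit sits in at most one, so one flip restores the codeword.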

Error correction: decoding

Unit hypercube in dimension N; codewords; sent codeword; received word. Seek a codeword at distance pN from the received word. Metastable states stand in the way.

Phase transitions in decoding

(Plot: probability of perfect decoding vs noise p, dropping from 1 to 0; three thresholds marked: Algo < Max Prob < Shannon. Metastable states = traps: a glass phase opens between the algorithmic and Shannon thresholds.)

- Parity-check code in its easy phase: time O(N)
- Parity-check code in its glass phase: time O(e^N)
- «Ideal» code: time O(e^N)

Example: (3,6) code on the BSC: p_d = 0.084, p_c = 0.101, p_S = 0.110.

Example: repetition code, k = 3: s → encoder → t → channel (flip probability f = 10%) → r → decoder → ŝ, at p = 0.1. (D. MacKay)

Low-density parity-check code with rate 1/3: transmission and decoding at p = 0.1. (Binary symmetric channel: 0 → 0 and 1 → 1 with probability 1 - f, crossover with probability f.) Errorless.

Glasses and algorithms

Phase transitions and glass phases are ubiquitous, in nature and in algorithms trying to solve constraint satisfaction problems with many variables and constraints.

Phase transitions relate to the equilibrium system, reached after infinitely long (i.e. exponential) time. Dynamic glass transitions relate to the practical performance of (polynomial-time) algorithms. Deep links with computer science.

P versus NP

NP-complete: SAT, 3SAT, 3-colouring, TSP(d), Hamiltonian cycle, 3-d spin glass, LDPC
P: 2SAT, Eulerian circuit, Assignment, 2-colouring, 2-d spin glass

(Diagram: P inside NP, NP-complete problems at the top; conjectured P ≠ NP, possible P = NP.)

Links between NP-hardness and glass transition? Worst case vs typical sample: statistical physics = behavior of a typical large-size sample drawn from an ensemble.
- Importance of phase transitions
- Importance of metastable states
- Use of physics ideas to come up with new types of algorithms: mean-field cavity message-passing algorithms are among the most powerful algorithms for a large class of inference and constraint satisfaction problems

Example: phase diagram of satisfiability and coloring

(Diagram: as α = M/N grows, SAT (E = 0) → UNSAT (E > 0): one state, then many states with E = 0, then many states with E > 0, with thresholds α_d, α_c; for 3SAT the SAT/UNSAT transition is at α = 4.267.)

Structural (geometrical) phase transition. Impacts all known algorithms. Survey propagation = best known algorithm.

Significant data and knowledge extraction

Images with 10^6 pixels: 128^(10^6) possible images. Instantaneous patterns of firing neurons: 2^(10^10).

YouTube: «How to draw George W. Bush in 15 seconds».

Extract significant information from high-dimensional data: for the scientist, the citizen, the neural network, the computer scientist (machine learning). NB: JPEG compression: from 10^6 down to 20 000.

Compressed sensing. Possible applications:
- Rapid Magnetic Resonance Imaging
- Tomography
- Electron microscopy
- Fluorescence microscopy
- Astronomy
- Generalized linear regression
- Group testing
- Inferring regulatory interactions among many genes using only a limited number of experimental conditions
- ...
- How does the brain work?

The simplest problem: getting a signal from measurements = linear transforms

Consider a system of linear measurements y = F s, where y = (y_1, ..., y_M) are the measurements, F is an M × N matrix, and s = (s_1, ..., s_N) is the signal. Random F: «random projections» (incoherent with the signal).

Problem: find s when M < N and s is sparse.

Compressed sensing as an optimization problem: the L_1-norm approach

Find an N-component vector x such that the M equations y = F x are satisfied and ||x||_1 is minimal. Hopefully x = s.

||x||_0: number of non-zero components; ||x||_p = Σ_i |x_i|^p. Ideally, use ||x||_0. In practice, use ||x||_1.

NB: «noisy version»: y = F x + noise.
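In the noisy setting the L_1 relaxation is typically solved as a LASSO; a minimal numpy sketch of ISTA (iterative soft thresholding, a standard proximal-gradient solver — not the method of these slides, and the regularization strength, sizes, and seed below are illustrative assumptions):

```python
import numpy as np

def ista(F, y, lam=0.005, n_iter=2000):
    """min_x 0.5*||y - F x||^2 + lam*||x||_1 by iterative soft thresholding."""
    step = 1.0 / np.linalg.norm(F, 2) ** 2   # 1 / Lipschitz constant of the gradient
    x = np.zeros(F.shape[1])
    for _ in range(n_iter):
        z = x + step * F.T @ (y - F @ x)     # gradient step on the quadratic part
        x = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)  # soft threshold
    return x

# Illustrative noiseless example: a 2-sparse signal from M = 10 measurements
rng = np.random.default_rng(0)
N, M = 20, 10
s = np.zeros(N)
s[3], s[11] = 1.0, -1.0
F = rng.standard_normal((M, N)) / np.sqrt(M)
xhat = ista(F, F @ s)
```

With a small lam, the reconstruction matches s up to the O(lam) shrinkage bias of the LASSO.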

Phase diagram - phase transitions

Gaussian random matrix. Axes: α = M/N, number of measurements per variable, vs ρ = R/N, fraction of non-zero variables. (Donoho 2006; Donoho, Tanner 2005; Kabashima, Wadayama, Tanaka, JSTAT 2009.)

(Diagram: above the L_1 line, reconstruction is possible with L_1; below α = ρ, reconstruction is impossible.)

Find an N-component vector x such that the M equations y = F x are satisfied and ||x||_1 is minimal.

An alternative approach, able to reach the optimal rate α = ρ (Krzakala, Sausset, Mézard, Sun, Zdeborová 2011-2014):
- Probabilistic approach = inference
- Message-passing reconstruction of the signal
- Careful design of the measurement matrix
NB: each of these three ingredients is crucial.

Step 1: probabilistic approach to compressed sensing

Probability P(x) that the signal is x:
I) x must be compatible with the measurements: Σ_i F_µi x_i = y_µ
II) An a priori measure on x favours sparsity. Gauss-Bernoulli prior: with probability 1 - ρ, x_i = 0; with probability ρ, x_i is drawn from a Gaussian distribution.

Theorem: with this measure, the original signal x = s is the most probable (not obvious!).
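The Gauss-Bernoulli prior is easy to sample; a sketch generating a signal and its random-projection measurements (sizes, density, and seed are illustrative):

```python
import numpy as np

def gauss_bernoulli(N, rho, rng):
    """Each entry is 0 with probability 1 - rho, else a standard Gaussian."""
    return (rng.random(N) < rho) * rng.standard_normal(N)

rng = np.random.default_rng(1)
N, M, rho = 1000, 500, 0.2
s = gauss_bernoulli(N, rho, rng)
F = rng.standard_normal((M, N)) / np.sqrt(N)   # random projections
y = F @ s                                       # the M measurements
```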

Step 2: sampling from the constrained measure

P(x) = Π_{i=1..N} [ (1-ρ) δ(x_i) + ρ φ(x_i) ] · Π_{µ=1..M} δ( y_µ - Σ_i F_µi x_i ),  with φ Gaussian.

The «native configuration» = stored signal x_i = s_i is infinitely more probable than the other configurations. Efficient sampling? Not so easy.

Each of the M constraints involves all the variables: «long-range» weak interactions (e.g. the Curie-Weiss model for magnets). Mean field is exact*. «Mean field» here = belief propagation (spin-glass mean-field equations, TAP).

Belief propagation = mean-field-like equations

Local order parameters on each edge (i, µ):
a_{i→µ} = ⟨x_i⟩_µ,  v_{i→µ} = ⟨x_i^2⟩_µ - (⟨x_i⟩_µ)^2

where ⟨·⟩_µ denotes the mean in the absence of constraint µ («cavity»-type measure).

Closed self-consistent equations relate these order parameters («BP», «TAP», «G-AMP», ...). The four «messages» sent along each edge i-µ (4NM numbers) can be simplified to O(N) parameters.
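The scalar building block of these closed equations is the posterior mean of a single x_i under the Gauss-Bernoulli prior, given a Gaussian «cavity» estimate r = x + noise of variance sigma2. A sketch of that denoiser, assuming a unit-variance Gaussian slab (the slides do not fix this normalization):

```python
import math

def gb_posterior_mean(r, sigma2, rho):
    """E[x | r] for prior (1-rho)*delta(x) + rho*N(0,1) and r = x + N(0, sigma2)."""
    def gauss(u, var):
        return math.exp(-u * u / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)
    slab = rho * gauss(r, 1.0 + sigma2)          # evidence that x is nonzero
    spike = (1.0 - rho) * gauss(r, sigma2)       # evidence that x = 0
    return slab / (slab + spike) * r / (1.0 + sigma2)
```

The estimate interpolates between 0 (spike wins) and the plain Gaussian shrinkage r/(1 + sigma2) (slab wins), which is what makes the iteration sparsity-seeking.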

Performance of the probabilistic approach + message passing + parameter learning: simulations, and analytic study of the large-N limit (replica method, cavity method).

Analytic study: cavity equations, density evolution, replicas, state evolution

The replica method allows one to compute the «free entropy»
Φ(D) = lim_{N→∞} (1/N) log P(D),
where P(D) is the probability that the reconstructed x is at distance D from the original signal s, with D = (1/N) Σ_i (x_i - s_i)^2.

The cavity method shows that the order parameters of the BP iteration flow according to the gradient of the replica free entropy («density evolution» equations): analytic control of the BP equations. NB rigorous: Bayati-Montanari, Lelarge-Montanari.

(Plots: free entropy Φ(D) = (1/N) log P(D) vs distance D to the native state, for α = 0.56 to 0.62 at ρ = 0.4; and BP convergence time vs α, diverging at the transition, for plain BP and seeded BEP with L = 10, 40.)

Dynamic glass transition: when α is too small, BP is trapped in a glass phase.

Performance of BP: phase diagram

(Plot: α vs ρ, showing the α_L1(ρ) line, the α_BP(ρ) line, the S-BEP line, and the α = ρ line.)

Step 3: design the measurement matrix in order to get around the glass transition

Getting around the glass trap: design the matrix F so that one nucleates the native state (crystal-nucleation idea, ... borrowed from error-correcting codes! Felström-Zigangirov; Kudekar, Richardson, Urbanke; Hassani, Macris, Urbanke; ...). «Seeded BP»: a seed to nucleate the crystal.

A construction inspired by the spatially coupled matrices developed in coding theory (cf. Urbanke et al.). Mixed mean-field and one-dimensional system:
1) Create many mean-field sub-systems (sub-systems 1, ..., 5).
2) Add a first-neighbor coupling.
3) Choose parameters such that the first sub-system (the seed, with a large value of α) is in the region of the phase diagram with no metastability; the other sub-systems have low values of α. On average, α is still low!
4) The solution first appears in the first sub-system (with large α), and then propagates through the whole system.

Numerical study

(Plot: mean square error per block vs block index, at times t = 1 to 300; L = 20, ρ = 0.4, N = 50000, J_1 = 20, J_2 = 0.2, α_1 = 1, α = 0.5. At t = 10 the first block is decoded; at t = 100, blocks 1 to 9 are decoded.)

Seeded BP: a threshold-reaching strategy

(Plot: α vs ρ with the α_L1(ρ), α_BP(ρ), S-BEP, and α = ρ lines; seeded BP sits close to α = ρ.)

Theory: the seeded-BP threshold reaches α = ρ when L → ∞. The L_1 phase-transition line also moves up when using a seeded matrix F.

Optimal performance on artificial signals (sparse, with independent components). More realistic signals?

Compressed sensing: measure «random projections» and reconstruct the signal

(Image comparison on an original 128^2 picture, at compression factors α = 0.6, 0.5, 0.4, 0.3, 0.2, down to the information-theoretically optimal compression α = ρ. Three methods: L_1 linear relaxation (Candès-Tao, Donoho); the mean-field algorithm (BEP), limited by its glass transition; and mean field + seeding strategy to avoid the glass (S-BEP, Krzakala et al.).)

Bird's eye view

Failure of the «representative agent» mean-field idea: it is impossible to describe spin glasses (or structural glasses) with a single effective spin; that misses all the richness. Go beyond effective-medium descriptions, e.g. for epidemic propagation (SIR) or gene expression: beyond the law of mass action. «Whom or why does the effective agent represent?» (A. Kirman); minority games.

The «representative agent» mean-field idea is substituted by statistical approaches to the heterogeneous behaviors of individual agents, plus simulations and algorithms. More complicated («ordered phases», equilibration times, etc.), but much, much richer. This «curse of dimensionality poses several challenges to both technical and conceptual progress in neuroscience» ( ); similarly in other disciplines (big data). Infinitely complex?

The Replica Method, the Cavity Method, and all the conceptual, mathematical and physical challenges that appeared in the study of spin glasses have been a «source from which new knowledge and application have flowed profusely» since their invention 35 years ago.* (* D. Sherrington)

Two amusing remarks:
- Physical spin glasses are still far from well understood
- «Useless» spin glasses have found applications in fields that were totally unexpected

(Portrait of Ottavio Strada, Tintoretto, Venice 1567, Rijksmuseum Amsterdam)

Structured measurement matrix

Variances of the matrix elements: the F_µi are independent random Gaussian variables with zero mean and variance J_{b(µ)b(i)}/N, where b(µ), b(i) index blocks. The block-variance matrix is tridiagonal: unit coupling on the diagonal, coupling J_1 below the diagonal, coupling J_2 above it, no coupling (null elements) elsewhere.

Example: L = 8 blocks, N_i = N/L, M_i = α_i N/L. Block 1 has a large value α_1 > α_BP, so that the solution arises in this block and then propagates through the whole system; the other blocks have α_j = α_0 < α_BP for j ≥ 2, with average α = (1/L)(α_1 + (L-1) α_0).
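The tridiagonal block-variance profile can be written down directly; a sketch (the helper name is illustrative) building the L × L matrix of variances J_{b(µ)b(i)}, with the slide's values L = 8, J_1 = 20, J_2 = 0.2:

```python
import numpy as np

def block_variances(L, J1, J2):
    """Tridiagonal variance profile: 1 on the diagonal, J1 below, J2 above."""
    J = np.eye(L)
    for b in range(L - 1):
        J[b + 1, b] = J1    # coupling of block b+1 back to block b
        J[b, b + 1] = J2    # coupling of block b forward to block b+1
    return J

# F_mu,i is then drawn Gaussian, zero mean, variance J[b(mu), b(i)] / N
J = block_variances(8, 20.0, 0.2)
```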

An example: tomography of binary mixtures

Synchrotron light; L measurements for each angle; L angles: L^2 measurements for L^2 pixels.

If the size of the domains (of order 1) is much larger than the pixel size 1/L, it is possible to reconstruct with far fewer than L^2 measurements.

This picture, digitalized on a 1000 × 1000 grid, can be reconstructed from measurements with only 16 angles. (Gouillart et al., arXiv)