Loop optimization for tensor network renormalization. Shuo Yang (Perimeter Institute for Theoretical Physics, Waterloo, Canada), Zheng-Cheng Gu (CUHK, PI), Xiao-Gang Wen (MIT, PI). Yukawa Institute for Theoretical Physics, Kyoto University, Japan, June 2016. arXiv:1512.04938
Overview: tensor network + renormalization group = tensor network renormalization. Partition function of a classical statistical system; Euclidean path integral of a 1D quantum system (e^{-βH}); physical properties of a 2D quantum system. Z = Σ_{{σ}} exp(β Σ_⟨ij⟩ σ_i σ_j)
Overview: tensor network + renormalization group = tensor network renormalization. Boundary MPS vs. fully isotropic coarse-graining.
Overview: tensor network + renormalization group = tensor network renormalization. How does the physics change with scale? Scale transformation. Non-perturbative approach, suitable for strongly correlated systems; reproduces long-range physics.
Overview: tensor network + renormalization group = tensor network renormalization. How does the physics change with scale? Scale transformation. Non-perturbative approach, suitable for strongly correlated systems; reproduces long-range physics. RG flow. Aim: remove short-range entanglement / correlations; generate the proper RG flow & correct fixed points; recover scale invariance at criticality.
Real-space renormalization group. 1D quantum: real-space NRG (the real-space analogue of Wilson's NRG (1975)) breaks down for the particle-in-a-box problem; keeping entanglement leads to DMRG, S. White (1992). 2D classical: block spin, L. P. Kadanoff (1966); keeping entanglement leads to TRG, Levin & Nave (2007).
Tensor renormalization group, Levin & Nave (2007): LN-TNR. Three steps: 1. Deform tensors and make a truncation. 2. Coarse graining. 3. Renormalize tensors (multiply the tensor by a constant factor).
Tensor renormalization group: the truncation uses the singular value decomposition (SVD). Decompose the original tensor as T = U S V† and keep only the χ largest singular values, splitting T into two three-leg tensors, e.g. S1 = U √S_keep and S2 = √S_keep V†. Cost function: the truncation error ||T_original − S1 S2||.
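To make the truncation step concrete, here is a minimal NumPy sketch of the SVD split: the four-leg tensor is reshaped into a matrix by grouping two legs on each side (one of the two groupings used on the two sublattices), decomposed, and only the χ largest singular values are kept. The shapes and the (u,l)/(d,r) grouping are illustrative conventions, not necessarily those of the slide.

```python
import numpy as np

def svd_split(T, chi):
    """Split a four-leg tensor T[u, l, d, r] into two three-leg tensors
    S1[u, l, a] and S2[a, d, r], keeping the chi largest singular values."""
    Du, Dl, Dd, Dr = T.shape
    M = T.reshape(Du * Dl, Dd * Dr)                      # group (u, l) vs (d, r)
    U, s, Vh = np.linalg.svd(M, full_matrices=False)
    keep = min(chi, s.size)
    sqrt_s = np.sqrt(s[:keep])
    S1 = (U[:, :keep] * sqrt_s).reshape(Du, Dl, keep)            # U * sqrt(s)
    S2 = (sqrt_s[:, None] * Vh[:keep, :]).reshape(keep, Dd, Dr)  # sqrt(s) * V^dagger
    # Cost function: norm of the difference between T and the truncated product
    err = np.linalg.norm(M - np.tensordot(S1, S2, axes=(2, 0)).reshape(M.shape))
    return S1, S2, err
```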
Drawbacks of LN-TNR. Off criticality: it cannot remove corner-double-line (CDL) tensors, which factorize into four corner matrices and carry only short-range entanglement, so it cannot give the correct structure of the fixed points.
Drawbacks of LN-TNR. At criticality: it cannot explicitly recover scale invariance and cannot completely remove short-range entanglement.
Drawbacks of LN-TNR. At criticality: it cannot explicitly recover scale invariance and cannot completely remove short-range entanglement. Example: the classical Ising model, with partition function Z = Σ_{{σ}} exp(β Σ_⟨ij⟩ σ_i σ_j).
Drawbacks of LN-TNR. At criticality: it cannot explicitly recover scale invariance and cannot completely remove short-range entanglement. Example: the classical Ising model, with partition function Z = Σ_{{σ}} exp(β Σ_⟨ij⟩ σ_i σ_j) and local tensor T = T^Ising_{u,l,d,r}, whose nonzero elements are given by the Boltzmann weights e^{±β}; all other elements are 0.
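As a concrete illustration, one standard way to build such a local tensor (not necessarily the exact convention behind the slide's element list) is to split the bond Boltzmann matrix W_{σσ'} = e^{βσσ'} as W = Q Qᵀ and attach one factor of Q to each leg; a minimal NumPy sketch:

```python
import numpy as np

def ising_tensor(beta):
    """Local tensor T[u, l, d, r] for Z = sum_{sigma} exp(beta * sum_<ij> s_i s_j):
    split the bond Boltzmann matrix W = Q Q^T and attach one factor Q per leg."""
    # Bond Boltzmann matrix for neighboring spins s, s' in {+1, -1}
    W = np.array([[np.exp(beta), np.exp(-beta)],
                  [np.exp(-beta), np.exp(beta)]])
    # Symmetric square root via the eigendecomposition (eigenvalues are positive for beta > 0)
    vals, vecs = np.linalg.eigh(W)
    Q = vecs @ np.diag(np.sqrt(vals))
    # T[u, l, d, r] = sum_s Q[s, u] Q[s, l] Q[s, d] Q[s, r]
    return np.einsum('su,sl,sd,sr->uldr', Q, Q, Q, Q)

T = ising_tensor(beta=0.4)  # exact critical point: beta_c = ln(1 + sqrt(2)) / 2 ≈ 0.4407
```

Contracting this tensor over the square lattice reproduces Z exactly; the resulting network is then fed into the coarse-graining.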
Drawbacks of LN-TNR. Ising CFT: central charge c = 1/2; scaling dimensions 0 (identity I), 1/8 (σ), 1 (ε), and their descendants.
Drawbacks of LN-TNR. Ising CFT: central charge c = 1/2; scaling dimensions 0 (identity I), 1/8 (σ), 1 (ε), and their descendants. Calculate the central charge c and the scaling dimensions from the eigenvalues of the transfer matrix [Zheng-Cheng Gu and Xiao-Gang Wen, Phys. Rev. B 80, 155131 (2009)].
Drawbacks of LN-TNR. With LN-TNR the extracted scaling dimensions change with the RG step, so scale invariance is not explicitly recovered. [Plot: c and the even/odd-sector scaling dimensions vs. RG step.]
Drawbacks of LN-TNR. With LN-TNR the extracted scaling dimensions change with the RG step, so scale invariance is not explicitly recovered; the high-index parts eventually destroy the low-index parts. The accuracy is fine, but the stability is bad.
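For reference, the extraction step can be sketched as follows, assuming the renormalized tensor is isotropic (square aspect ratio) and has flowed close to the critical fixed point: trace the horizontal legs of a single tensor to form a width-1 transfer matrix and convert the eigenvalue ratios into scaling dimensions via Δ_α = (w/2π) ln(λ_0/λ_α) with w = 1. Wider rows resolve more dimensions, and the central charge needs the absolute normalization (or a comparison of different widths), both omitted in this sketch.

```python
import numpy as np

def scaling_dimensions(T, num=8):
    """Estimate CFT scaling dimensions from a width-1 transfer matrix built from a
    single four-leg tensor T[u, l, d, r] with its horizontal legs traced periodically."""
    M = np.einsum('uxdx->ud', T)                 # periodic trace over the horizontal legs
    lam = np.abs(np.linalg.eigvals(M))
    lam = np.sort(lam)[::-1]
    lam = lam[lam > 1e-14 * lam[0]]              # discard numerically zero eigenvalues
    # For a width-w row of isotropic tensors: Delta_alpha = (w / 2 pi) * ln(lambda_0 / lambda_alpha)
    return np.log(lam[0] / lam[:num]) / (2.0 * np.pi)
```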
Improvements of TRG, off-critical vs. critical: TRG (Levin & Nave, 2006); SRG (Xie, Jiang, Weng, Xiang, 2008); TEFR (Gu & Wen, 2009); TNR (Evenbly & Vidal, 2014); Loop-TRG (this work).
Renormalization group: momentum space vs. real space. Tree-level approximation (no loop); beyond tree level (one loop).
Renormalization group: momentum space vs. real space. Tree-level approximation (no loop): simply ignore the fast modes above the cutoff. Beyond tree level (one loop).
Renormalization group: momentum space vs. real space. Tree-level approximation (no loop): simply ignore the fast modes. Beyond tree level (one loop): integrate over the fast modes.
Renormalization group: momentum space vs. real space. Tree level: simply ignore the fast modes; the real-space analogue is LN-TNR, which removes short-range entanglement by a local SVD. One loop: integrate over the fast modes.
Renormalization group: momentum space vs. real space. Tree level: simply ignore the fast modes (LN-TNR: remove short-range entanglement by a local SVD). One loop: integrate over the fast modes (LN-TNR + loop optimization: further remove the short-range entanglement inside a loop; this work!).
Algorithms of Loop-TNR
Algorithms of Loop-TNR Part One: Entanglement filtering
Algorithms of Loop-TNR Part One: Entanglement filtering Part Two: Optimizing tensors on a loop
Algorithms of Loop-TNR. Part One: Entanglement filtering. Part Two: Optimizing tensors on a loop. Together: completely remove short-range entanglement.
Part One: Entanglement filtering. How: 1. Find & insert projectors P_L and P_R on each bond around a plaquette of four T tensors. 2. Define new tensors by absorbing the projectors.
Part One: Entanglement filtering. How: 1. Find & insert projectors P_L and P_R on each bond around a plaquette of four T tensors. 2. Define new tensors by absorbing the projectors. Aim: remove corner-double-line (CDL) tensors and generate a local canonical gauge [Zheng-Cheng Gu and Xiao-Gang Wen, Phys. Rev. B 80, 155131 (2009)].
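A simplified sketch, in the spirit of this filtering step, of how such projectors can be found: repeated QR / LQ sweeps around the plaquette accumulate left and right environment matrices on a bond, and an SVD of their product yields the pair of projectors. The leg convention (left-on-loop, outer, right-on-loop), the function names, and the sweep count are assumptions for illustration, not the exact procedure of the talk.

```python
import numpy as np

def loop_environments(tensors, n_sweeps=30):
    """Accumulate left (QR) and right (LQ) environment matrices on the bond between
    tensors[-1] and tensors[0]; each loop tensor has legs (left, outer, right)."""
    L = np.eye(tensors[0].shape[0])
    for _ in range(n_sweeps):                            # forward QR sweeps around the loop
        for T in tensors:
            A = np.tensordot(L, T, axes=(1, 0)).reshape(-1, T.shape[2])
            _, L = np.linalg.qr(A)
            L = L / np.linalg.norm(L)
    R = np.eye(tensors[-1].shape[2])
    for _ in range(n_sweeps):                            # backward LQ sweeps around the loop
        for T in reversed(tensors):
            A = np.tensordot(T, R, axes=(2, 0)).reshape(T.shape[0], -1)
            _, r = np.linalg.qr(A.T)
            R = r.T / np.linalg.norm(r)
    return L, R

def filtering_projectors(L, R, chi, eps=1e-12):
    """Given environments L (from the left) and R (from the right) on one bond,
    return projectors PR, PL such that L @ R ≈ (L @ PR) @ (PL @ R)."""
    U, s, Vh = np.linalg.svd(L @ R)
    keep = min(chi, int(np.sum(s > eps * s[0])))
    U, s, Vh = U[:, :keep], s[:keep], Vh[:keep, :]
    PR = R @ Vh.conj().T / np.sqrt(s)                    # shape (D_bond, keep)
    PL = (U.conj().T / np.sqrt(s)[:, None]) @ L          # shape (keep, D_bond)
    return PR, PL
```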
Part Two Optimizing tensors on a loop
Part Two: Optimizing tensors on a loop. Cost function: LN-TNR truncates each tensor by a local SVD, whereas Loop-TNR minimizes the difference between the original plaquette of four tensors and the truncated loop of eight tensors.
Part Two: Optimizing tensors on a loop. Sweep around the loop and update one tensor at a time: each update reduces to solving a linear equation (see the sketch after this part).
Part Two Optimizing tensors on a loop
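The essence of one such update, as a minimal sketch: the truncated-loop wavefunction is linear in the tensor S currently being optimized, ψ_B = E vec(S), so minimizing ||ψ_A − ψ_B||² leads to a normal (linear) equation. Here ψ_A and the environment E are random stand-ins for the contracted plaquette and loop environments; the dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins: psi_A is the (fixed) wavefunction of the original plaquette, and the
# truncated loop is linear in the tensor S being updated: psi_B = E @ vec(S),
# where the environment E is the contraction of all the other loop tensors.
dim_psi, dim_S = 64, 12
psi_A = rng.standard_normal(dim_psi)
E = rng.standard_normal((dim_psi, dim_S))

# Minimizing |psi_A - E s|^2 over s gives the linear (normal) equation N s = W.
N = E.conj().T @ E
W = E.conj().T @ psi_A
s_opt = np.linalg.solve(N, W)   # in practice use lstsq / pinv if N is near-singular

# The optimal residual is orthogonal to the variational subspace spanned by E.
assert np.allclose(E.conj().T @ (psi_A - E @ s_opt), 0.0, atol=1e-8)
```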
Part Two: Optimizing tensors on a loop. Loop-TNR results: scaling dimensions extracted from a transfer matrix of width L.
Part Two: Optimizing tensors on a loop. Loop-TNR results: the scaling dimensions do not change with scale (scale invariance), and there is a clear gap between the high-level and low-level parts.
Stability. [Figure: c and the even/odd-sector scaling dimensions vs. iteration step; panels (a), (c), (e): LN-TNR; (b), (d), (f): Loop-TNR, for increasing bond dimension χ and transfer-matrix width L.]
Stability. At the smaller χ the Loop-TNR results remain accurate for many iteration steps, and even longer at the larger χ: the proper RG flow lasts longer for larger χ.
Stability. Increasing χ, more scaling dimensions can be resolved.
Stability. With the wider transfer matrix the effective number of states is much larger (it grows as χ^L), so many more scaling dimensions can be read off and the accuracy is higher.
Stability. In the χ → ∞ limit, the fixed-point tensor becomes the infinite-dimensional one described by the Ising CFT.
Accuracy
Relative error of the free energy per site. [Figure: (a) δf(T_c) vs. χ at criticality, LN-TNR vs. Loop-TNR; (b) δf(T) vs. T/T_c off criticality at χ = 8.]
Relative error of the free energy per site. At the critical point, the error of Loop-TNR decays much faster than that of LN-TNR; it decays almost exponentially with the bond dimension.
Relative error of the free energy per site. At off-critical points, the error remains almost constant for all temperatures near the critical point.
Loop-TRG with reflection symmetry. [Figure (a)-(i): decomposition steps, with the axis of symmetry and the tensors T and S marked.] Exploit the global C4 symmetry and fix the gauge; in the ideal case the fixed-point tensor is explicitly invariant.
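As a small hedged check of what "explicitly invariant" means for a single tensor: with legs (up, left, down, right) listed counterclockwise, a 90° rotation is a cyclic permutation of the legs and a left-right reflection swaps the left and right legs, so the deviation from invariance can be measured directly. The leg ordering is an assumption; a different convention needs different permutations.

```python
import numpy as np

def symmetry_violation(T):
    """Relative deviation of a four-leg tensor T[u, l, d, r] (legs listed
    counterclockwise) from invariance under a 90-degree rotation and a
    left-right reflection."""
    rot = np.transpose(T, (1, 2, 3, 0))   # cyclic shift of the legs = 90-degree rotation
    refl = np.transpose(T, (0, 3, 2, 1))  # swap the left and right legs
    nrm = np.linalg.norm(T)
    return np.linalg.norm(T - rot) / nrm, np.linalg.norm(T - refl) / nrm
```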
Fixed-point tensors. LN-TNR (absolute values and absolute differences of the tensor elements between RG steps): only the low-index parts are approximately invariant. Loop-TNR: all tensor elements are nearly invariant, i.e. scale invariance.
Summary
Summary. Real-space TNR. Tree level: LN-TNR (Levin & Nave, 2007). One loop: GW-TNR, EV-TNR, Loop-TNR.
Summary. Real-space TNR. Tree level: LN-TNR (Levin & Nave, 2007). One loop: GW-TNR, EV-TNR, Loop-TNR. Correspondence with 1D algorithms (LN-TNR + 1D algorithm): iTEBD ↔ GW-TNR (Gu & Wen, 2009) and Loop-TNR Part One; DMRG / variational MPS ↔ Loop-TNR Part Two; MERA ↔ EV-TNR (Evenbly & Vidal, 2014).
Thank you!