
Introduction to tensor network state -- concept and algorithm Z. Y. Xie ( 谢志远 ) 2018.10.29 ITP, Beijing

Outline
- Illusion of complexity of the Hilbert space
- Matrix product state (MPS) as a lowly-entangled state
- Tensor network state (TNS) as a lowly-entangled state
- Partition function as a tensor network
- Brief review of RG techniques to evaluate a 2D tensor network
- Partial list of successful applications of TNS

Why do we need Tensor Network States (TNS)? The Hilbert space is too large: its dimension grows exponentially with the system size N (the catastrophe of dimension, the exponential wall problem). Dirac: too complicated to be solved. Kohn: the many-body wavefunction is not a scientific concept.

Why do we need Tensor Network States (TNS)? Nature does not exhaust all the possibilities. A 1024 * 1024 binary image can depict essentially anything in the universe, yet there are vastly more such images than things to depict: about 10^315653 of them, almost all of which are meaningless noise you will never see.
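The count quoted above can be verified in a couple of lines; a minimal sketch using only the Python standard library:

```python
from math import log10

# Number of distinct 1024 x 1024 binary images: 2**(1024*1024).
# Count its decimal digits via log10 instead of materializing the integer.
digits = int(1024 * 1024 * log10(2)) + 1
print(digits)  # prints 315653, i.e. about 10^315653 images
```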

Why do we need Tensor Network States (TNS)? Nature is meaningful and usually local: particles are uncorrelated at long distance. Entanglement-entropy area law conjecture: all physically relevant systems satisfy an area law (possibly with log corrections in gapless/critical systems), i.e., the entanglement entropy between a system and its environment scales with the boundary area.

Why do we need Tensor Network States (TNS)? From the viewpoint of entanglement, the full Hilbert space is too large to study, or even to enumerate. TNS is suited to the lowly-entangled corner of the Hilbert space, which is precisely the region relevant for physical quantum many-body (QMB) states obeying the area law.

Graphical Notations
- Open links
- Shared links
- State and operator
- State norm
- Expectation value

1D TNS: Matrix Product State (MPS)
- Graphics and expression
- Some exact states: GHZ, Majumdar-Ghosh, AKLT
- Matrix Product Operator (MPO): e.g., the Heisenberg Hamiltonian
- Key: the 1D area law; an MPS is always finitely correlated
- Canonical form
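To illustrate the "exact states" bullet, the GHZ state admits a bond-dimension-2 MPS. A minimal sketch in Python/NumPy (variable names are my own, not from the slides), writing the amplitude as a trace of matrix products:

```python
import numpy as np

# GHZ state on N sites as a bond-dimension-2 MPS (periodic form):
# amplitude(s1..sN) = Tr[A[s1] ... A[sN]] / sqrt(2)
A = np.zeros((2, 2, 2))          # A[s] is a 2x2 matrix for each physical index s
A[0] = np.diag([1.0, 0.0])
A[1] = np.diag([0.0, 1.0])

N = 6
psi = np.zeros(2 ** N)
for idx in range(2 ** N):
    bits = [(idx >> k) & 1 for k in range(N)]
    M = np.eye(2)
    for s in bits:
        M = M @ A[s]
    psi[idx] = np.trace(M) / np.sqrt(2)

# Only |00...0> and |11...1> survive, each with amplitude 1/sqrt(2):
# a 2^N-dimensional state stored in O(N) parameters.
```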

1D TNS: Matrix Product State (MPS). Energy minimization: find a PEPS which minimizes the energy; alternatively, imaginary-time evolution.

2D Tensor Network State. Graphical notations (the basis is always implicit): scalar (in PBC), operator, and state parameterized by virtual/physical indices.

2D Tensor Network State: graphics and expression of the norm/expectation value.

2D Tensor Network State. Members: PEPS, PESS (entanglement carried by virtual particles):
- Projected Entangled Pair States (PEPS): entangled pairs, arXiv:cond-mat/0407066
- Projected Entangled Simplex States (PESS): entangled simplices, PRX 4, 011025 (2014)

2D Tensor Network State. Members: Correlator Product State (CPS), e.g., H. J. Changlani et al., PRB 80, 245116 (2009): a product of correlators, with no virtual particles.

2D Tensor Network State. Members: TTN, MERA (G. Vidal, PRL 99, 220405 (2007)):
- Tree Tensor Network (TTN)
- Multi-scale Entanglement Renormalization Ansatz (MERA)
RG lineage: Kadanoff, Fisher, Wilson.

2D Tensor Network State. Different classes encode different entanglement structures. Area-law summary:
- 1D area law: TTN
- 1D area law with log corrections: 1D MERA
- 2D area law: PEPS, PESS, short-range CPS, 2D MERA
- 2D area law with log corrections (even volume law): long-range CPS, branching MERA

2D Tensor Network State. Some exact states: RVB, Toric Code, SSS. Construction of the SSS:
(1) Represent each spin-2 in terms of two virtual spin-1s.
(2) The virtual spins in each simplex form a simplex singlet.
(3) Project the virtual space onto the physical spin-2 space.

2D Tensor Network State: power-law-decaying correlations are possible, evaluated from ⟨Ψ|Ψ⟩.

2D Tensor Network State. 2D iTEBD, the simple update: optimization with a purely local view of the environment.

2D Tensor Network State Fermionic statistics P. Corboz, PRL 113, 046402 (2014)

Partition function as a tensor network. Tensor network model (TNM): any statistical model with only local interactions has an exact tensor-network representation of its partition function, Z expressed as the tensor trace of a network of local tensors T.
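The claim can be checked on the simplest case, a 1D Ising ring, where the tensor network is just a ring of identical 2x2 transfer matrices. A sketch with arbitrarily chosen parameter values:

```python
import numpy as np
from itertools import product

# 1D Ising ring, H = -J * sum_i s_i s_{i+1}, s = +/-1.
# The partition function is a ring of identical local tensors (transfer matrices):
# Z = Tr[T^N] with T[s, s'] = exp(beta * J * s * s').
beta, J, N = 0.7, 1.0, 8
spin = np.array([1.0, -1.0])
T = np.exp(beta * J * np.outer(spin, spin))

Z_tn = np.trace(np.linalg.matrix_power(T, N))

# Brute-force check over all 2^N configurations
Z_brute = 0.0
for conf in product([1, -1], repeat=N):
    E = -J * sum(conf[i] * conf[(i + 1) % N] for i in range(N))
    Z_brute += np.exp(-beta * E)
```

The factorized Boltzmann weight exp(-beta*E) = prod_i exp(beta*J*s_i*s_{i+1}) is exactly what makes the local tensor representation exact; in 2D the same construction yields a rank-4 tensor on a square network.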

Partition function as a tensor network. Other constructions: the group construction and the dual construction.

What have we discussed up to now?
For quantum lattice systems:
- Choose a suitable ansatz (e.g., 5 classes)
- Obtain the wavefunction (e.g., 2 big classes)
- Obtain expectation values (e.g., 3 big classes)
For classical statistical systems:
- Map to a TNM (e.g., 3 methods)
- Obtain expectation values (e.g., 3 big classes)

RG methods to evaluate a tensor network. Contracting a 2D tensor network exactly is a #P-complete hard problem! Strategy: approximation via renormalization (information compression). Transfer Matrix (TM) RG: target an effective low-dimensional representation of T, and then diagonalize it; basis transformations drive it to the fixed point. XQW, Phys. Rev. B 56, 5061 (1997).

RG methods to evaluate a tensor network. Boundary MPS: infinite Time-Evolving Block Decimation (iTEBD). Target: an effective MPS representation of the dominant eigenvector, reached as the fixed point of a power method. R. Orus, Phys. Rev. B 78, 155117 (2008).
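The power method used here is generic: repeatedly apply the operator and renormalize, and the iterate converges to the dominant eigenvector. A toy version on a dense stand-in matrix (not an actual MPS transfer operator):

```python
import numpy as np

# Power method: repeatedly apply the (stand-in) transfer matrix and renormalize;
# the iterate converges to the dominant eigenvector, the "fixed point".
rng = np.random.default_rng(0)
T = rng.random((16, 16))         # stand-in for a flattened transfer matrix
T = T + T.T                      # symmetrize for a clean eigensolver comparison

v = rng.random(16)
for _ in range(500):
    v = T @ v
    v /= np.linalg.norm(v)

lam = v @ T @ v                  # Rayleigh-quotient estimate of the eigenvalue
```

In iTEBD the same iteration is performed in MPS form, with an SVD truncation after each application keeping the bond dimension fixed.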

RG methods to evaluate a tensor network. Corner Transfer Matrix RG (CTMRG). Target: an effective representation of the environment surrounding a bulk tensor A.

RG methods to evaluate a tensor network. CTMRG left move (P. Corboz, PRL 113, 046402 (2014)). The environment is a 4*4 cluster: corner matrices C1..C4 and edge tensors E1..E4 surrounding the bulk tensors A. The corner is enlarged by gradually absorbing the system to reach the fixed point; truncation is done by bond targeting, always on the 4*4 cluster.

RG methods to evaluate a tensor network. Coarse-graining RG: mimic Kadanoff's block-spin decimation in real space by scale transformations; the local degrees of freedom are renormalized and decimated at each scale transformation.

Local optimization. Levin-Nave Tensor RG (LN-TRG), realized by local SVD.
Step 1: lattice deformation. The pair of tensors is reshaped into a matrix and split by SVD,
M_{kj,il} = \sum_m T_{mji} T_{mlk} \approx \sum_{n=1}^{D} U_{kj,n} \lambda_n V_{il,n},
keeping the D largest singular values; SVD is the best scheme to truncate a matrix.
Step 2: decimation of the local degrees of freedom by summation,
T'_{xyz} = \sum_{ijk} S_{xik} S_{yji} S_{zkj}.
A complete scale-transformation step maps the lattice from N to N/3 sites.
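The truncated SVD in Step 1 is, by the Eckart-Young theorem, the best low-rank approximation of the matrix in the Frobenius norm. A minimal numerical sketch of this truncation step:

```python
import numpy as np

# SVD truncation: keep only the D largest singular values of M.
# By Eckart-Young this is the best rank-D approximation in Frobenius norm,
# which is exactly the truncation used in the Levin-Nave TRG step.
rng = np.random.default_rng(1)
M = rng.standard_normal((16, 16))   # stand-in for the reshaped tensor pair

U, s, Vh = np.linalg.svd(M, full_matrices=False)
D = 8
M_D = (U[:, :D] * s[:D]) @ Vh[:D]   # rank-D approximation

# Truncation error equals the norm of the discarded singular values
err = np.linalg.norm(M - M_D)
```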

Higher-Order TRG (HOTRG), realized by local HOSVD: coarse-grain along the lattice vectors alternately. PRB 86, 045139 (2012). Key problem: merging tensors produces bonds of dimension D^2 that must be truncated back to D, and two such cutoffs are needed simultaneously; how?

Low-rank approximation of a tensor is itself still an open problem! Higher-Order SVD (HOSVD), SIAM J. Matrix Anal. Appl. 21, 1253 (2000). Definition: pseudo-diagonalization by orthogonal transformations on each leg. Truncating the HOSVD gives a good low-rank approximation of T. The factor matrices U can be obtained independently by directional SVDs.
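The "directional SVDs" above can be sketched directly: one SVD per unfolding of the tensor gives the factor matrices, and contracting their transposes onto the tensor gives the core. A minimal HOSVD of a 3-way tensor (helper names are my own):

```python
import numpy as np

# HOSVD: one SVD per unfolding gives the factor matrix U_k for each leg;
# the core S = T x1 U1^T x2 U2^T x3 U3^T then satisfies T = S x1 U1 x2 U2 x3 U3.
rng = np.random.default_rng(2)
T = rng.standard_normal((4, 5, 6))

def unfold(X, k):
    """Mode-k unfolding: leg k becomes the row index, the rest are flattened."""
    return np.moveaxis(X, k, 0).reshape(X.shape[k], -1)

Us = [np.linalg.svd(unfold(T, k))[0] for k in range(3)]   # directional SVDs

# Core tensor: contract each U_k (transposed) onto the corresponding leg
S = np.einsum('abc,ai,bj,ck->ijk', T, Us[0], Us[1], Us[2])

# Exact reconstruction from the (untruncated) HOSVD
T_rec = np.einsum('ijk,ai,bj,ck->abc', S, Us[0], Us[1], Us[2])
```

Truncating each U_k to its leading columns yields the low-rank approximation used in HOTRG.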

In practical calculations, the isometry U is obtained from the HOSVD of M, choosing the direction with the smaller truncation error. The two directions are coarse-grained alternately, to capture more entanglement and to keep the lattice symmetry. A more symmetric method: bond SVD via gauge transformation.

Improvement 1: Second Renormalization Group (SRG), by global optimization. PRL 103, 160601 (2009). The difference between NRG and DMRG is the basis-selection scheme:
- 1974, Wilson, NRG: states are weighted according to the spectrum of the system alone.
- 1992, White, DMRG: states are weighted according to the spectrum of the system's reduced density matrix, i.e., the entanglement spectrum between the system and its environment.
The environment is important!

Note: HOTRG is a local update method without an environment. HOSRG considers the environment within the framework of HOTRG, writing Z = Tr[M M_env].
- Forward iteration: HOTRG to obtain the isometries U at all scales.
- Backward iteration: obtain E^(N-1), E^(N-2), ..., E^(2), E^(1) from the recursive relation.
- Sweep: this iteration can be repeated to gain more accuracy.

Extension to 3D: forward iteration, backward iteration, and a modified local decomposition; SRG can be used to globally optimize a 3D tensor network! SRG on 2D finite systems with PBC can be combined with a sweeping scheme: H. H. Zhao, Z. Y. Xie, T. Xiang, and M. Imada, Phys. Rev. B 93, 125115 (2016).

Improvement 2: EV-TRG / Loop-TRG, by removing local (short-range) entanglement. The corner-double-line (CDL) picture (TEFR: Z.-C. Gu and X.-G. Wen, PRB 80, 155131 (2009)):
- No long-range entanglement at all: topologically trivial phase.
- Long-range entanglement present: probably a topologically ordered phase.
The RG fixed-point tensor is low rank, with a direct-product (CDL) structure that can be removed locally; once removed, D can be much smaller without lowering the accuracy. This matches the RG spirit: low-scale entanglement should not reappear at higher scales near Tc.

EV-TRG, realized by disentanglers. Idea: why did we not see the CDL structure? Because it is hidden by local entanglement (a unitary transformation). In the spirit of MERA, a disentangler is added before the local decomposition (Evenbly and Vidal, PRL 115, 180405 (2015)). Equivalently, a special structure is enforced on the basis transformation: a unitary to disentangle (remove local entanglement), followed by an isometry to decimate.

The parameters are solved variationally: minimize the distance, or equivalently maximize the overlap. The isometric property can simplify the calculation for any isometry X and any matrix M.
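One concrete simplification of this kind (my illustration, not necessarily the identity the slide had in mind) is the orthogonal-Procrustes result: maximizing Tr(X^T M) over isometries X has the closed-form solution X = U V^T from the SVD M = U s V^T, with optimum equal to the sum of singular values. A numerical sketch:

```python
import numpy as np

# Maximizing Tr(X^T M) over isometries X (X^T X = 1) has a closed form:
# if M = U diag(s) Vh is the SVD, the optimum is X = U @ Vh,
# and the maximal overlap equals sum(s). (Orthogonal-Procrustes result.)
rng = np.random.default_rng(3)
M = rng.standard_normal((6, 4))

U, s, Vh = np.linalg.svd(M, full_matrices=False)
X = U @ Vh                      # optimal isometry, satisfies X^T X = 1

best = np.trace(X.T @ M)        # equals s.sum()
```

This is why fixing all tensors but one and updating that one via an SVD is a common inner step in variational TNR-type optimizations.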

Loop-TRG, realized by entanglement filtering. Idea: the CDL structure can be removed by canonicalization of a very sparse MPS on a loop.
- Canonicalization (basis transformation): removes all the loop entanglement.
- Deformation (square to octagon): performs the coarse graining.
- Variation: improves the accuracy.
S. Yang, Z.-C. Gu, and X.-G. Wen, PRL 118, 110504 (2017).

Method summary and comparison:
- Three categories: coarse-graining with local optimization, coarse-graining with global optimization, and transfer-matrix-based methods (the best).
- Coarse-graining: more suitable for extracting critical information from the RG fixed point.
- HOTRG/HOSRG: designed for 3D lattices; almost the only practical methods in 3D.
- SRG/HOSRG: can work for finite lattices (even with PBC), while the others have problems there.
- Entanglement removal: more suitable for critical systems.

Method summary: accuracy at the critical point. These methods remove short-range entanglement, but why this helps is not fully clear. Classified by (variation? / disentangling-filtering?): some methods use neither (No + No), while EV-TRG and Loop-TRG use both (Yes + Yes).

What have we discussed up to now?
For quantum lattice systems:
- Choose a suitable ansatz (e.g., 5 classes)
- Obtain the wavefunction (e.g., 2 big classes)
- Obtain expectation values (e.g., 3 big classes)
For classical statistical systems:
- Map to a TNM (e.g., 3 methods)
- Obtain expectation values (e.g., 3 big classes)

Successful Applications (Very Partial)
- 3D classical statistical models: Ising model, PRB 86, 045139 (2012)
- Frustrated spin models: AF kagome, J1-J2 square lattice, PRL 118, 137202 (2017)
- Unfrustrated spin models: many, e.g., the spin-1/2 AF Heisenberg model on the square lattice, Annu. Rev. Condens. Matter Phys. 3, 111 (2012)
- Superconductivity: t-J model, Hubbard model, PRL 113, 046402 (2014)
- Classical spin glasses: EA model, PRB 90, 174201 (2014)
- Quantum chemistry: Nat. Chem. 5, 660 (2013)
- Continuous DOF and the KT phase transition: PRE 89, 013308 (2014)
- Continuous space and 1D quantum field theory: PRL 104, 190405 (2010)
- 1D many-body localization: PRL 114, 170505 (2015)
- Topological order detection: PRL 111, 107205 (2013)
- Lattice gauge theory: PRD 88, 056005 (2013)

Machine Learning? Thank you!