Neural Network Representation of Tensor Network and Chiral States

Neural Network Representation of Tensor Network and Chiral States
Yichen Huang (黄溢辰)¹ and Joel E. Moore²
¹ Institute for Quantum Information and Matter, California Institute of Technology
² Department of Physics, University of California, Berkeley
KITS Workshop on Machine Learning and Many-Body Physics, July 6, 2017

Background In machine learning, neural networks are the most widely used method. The approach is variational: the learning process performs optimization over the parameters of the neural network. Types of neural network: feedforward neural networks, convolutional neural networks, (restricted) Boltzmann machines, etc. Recently, Carleo and Troyer simulated quantum many-body systems with a variational approach based on restricted Boltzmann machines. The most popular variational approach in the study of many-body physics is based on tensor network states. It is therefore important to compare and understand the connection between the representational power of neural network states versus tensor network states.

Boltzmann machine (BM)

ψ(s_1, s_2, …, s_{|V|}) = Σ_{s_{|V|+1}, …, s_{|V|+|H|} ∈ {±1}} exp( Σ_{j=1}^{|V|+|H|} h_j s_j + Σ_{j,k=1}^{|V|+|H|} w_{jk} s_j s_k )

V = {v_1, v_2, …, v_{|V|}} is the set of visible units carrying classical Ising variables s_1, s_2, …, s_{|V|}. H = {h_1, h_2, …, h_{|H|}} is the set of hidden units carrying s_{|V|+1}, s_{|V|+2}, …, s_{|V|+|H|}. Figure taken from https://en.wikipedia.org/wiki/Boltzmann_machine

Locality

ψ(s_1, s_2, …, s_{|V|}) = Σ_{s_{|V|+1}, …, s_{|V|+|H|} ∈ {±1}} exp( Σ_{j=1}^{|V|+|H|} h_j s_j + Σ_{j,k=1}^{|V|+|H|} w_{jk} s_j s_k )

A restricted Boltzmann machine (RBM) is a BM in which w_{jk} = 0 unless the edge connects a visible unit (j ≤ |V|) with a hidden unit (k ≥ |V| + 1). A (restricted) Boltzmann machine is local if there are only short-range connections. Figure taken from Deng et al., Phys. Rev. X 7, 021021, 2017.
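As a small numerical illustration of these definitions (a sketch of ours, not from the talk; the helper names `bm_amplitude` and `rbm_amplitude` are hypothetical), one can verify by brute force that when the weights are restricted to the visible-hidden block, the sum over hidden spins factorizes into a product of cosh factors:

```python
import itertools

import numpy as np

def bm_amplitude(s_vis, h, w):
    """Unnormalized BM amplitude psi(s_vis): brute-force sum over all
    hidden spin configurations of exp(sum_j h_j s_j + sum_{j<k} w_jk s_j s_k).
    w is symmetric with zero diagonal; 0.5 * s @ w @ s counts each pair once."""
    n_hid = len(h) - len(s_vis)
    total = 0.0
    for s_hid in itertools.product([1.0, -1.0], repeat=n_hid):
        s = np.concatenate([s_vis, s_hid])
        total += np.exp(h @ s + 0.5 * s @ w @ s)
    return total

def rbm_amplitude(s_vis, a, b, W):
    """Same amplitude for an RBM (visible biases a, hidden biases b,
    visible-hidden weights W): the hidden sum factorizes into cosh terms."""
    s = np.asarray(s_vis, dtype=float)
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(b + W.T @ s))
```

For an RBM embedded in the general BM form (w nonzero only in the visible-hidden block), the two functions agree; this factorization is what makes RBM amplitudes cheap to evaluate despite the exponential number of hidden configurations.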

References and related work
G. Carleo and M. Troyer. Solving the quantum many-body problem with artificial neural networks. Science 355, 602, 2017.
D.-L. Deng, X. Li, and S. Das Sarma. Machine learning topological states. arXiv:1609.09060; Quantum entanglement in neural network states. Phys. Rev. X 7, 021021, 2017.
J. Chen, S. Cheng, H. Xie, L. Wang, and T. Xiang. On the equivalence of restricted Boltzmann machines and tensor network states. arXiv:1701.04831.
X. Gao and L.-M. Duan. Efficient representation of quantum many-body states with deep neural networks. arXiv:1701.05039.
Y. Huang and J. E. Moore. Neural network representation of tensor network and chiral states. arXiv:1701.06246.

Motivations We study the representational power of Boltzmann machines. In terms of representational power, how do neural network states compare with tensor network states? Do all tensor network states have neural network representations? How much computational time does it take to convert a tensor network state into a neural network state? How much space does such a neural network representation occupy? Can we find a physically interesting class of states that have neural network representations but are hard (or even impossible) to describe using tensor networks?

Neural and tensor network representations We provide a polynomial-time algorithm that constructs a (local) neural network representation for any (local) tensor network state.1 The construction is almost optimal: the number of parameters in the neural network representation is almost linear in the number of nonzero parameters in the tensor network representation. Example: Any n-qubit matrix product state of bond dimension D has a local neural network representation with 2nD² hidden units and 4nD² log₂ D parameters. Using the universal approximation theorem, we convert every tensor in the network to an RBM. Then, we contract these RBMs in the same way as the tensors are contracted. Universal approximation theorem2: Any tensor M can be approximated arbitrarily well by an RBM, provided that the number of hidden units equals the number of nonzero elements of M. 1 See X. Gao and L.-M. Duan, arXiv:1701.05039 for a similar result. 2 N. Le Roux and Y. Bengio. Neural Comput. 20, 1631, 2008.
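To make the matrix-product-state example concrete, here is a minimal sketch (ours, not from the talk; `mps_amplitude` and `rbm_size_bound` are hypothetical helper names) that evaluates an open-boundary MPS amplitude by matrix contraction and tabulates the quoted sizes, 2nD² hidden units and 4nD² log₂ D parameters, under the assumption of dense tensors:

```python
import math

import numpy as np

def mps_amplitude(tensors, bits):
    """Amplitude <bits|psi> of an open-boundary MPS.
    tensors[i] has shape (2, D_left, D_right); boundary bond dimensions are 1."""
    m = tensors[0][bits[0]]               # 1 x D boundary row
    for A, b in zip(tensors[1:], bits[1:]):
        m = m @ A[b]                      # contract one site at a time
    return m.item()

def rbm_size_bound(n, D):
    """Hidden-unit and parameter counts quoted for the MPS -> RBM
    construction, assuming dense tensors (at most 2*D^2 nonzeros per site)."""
    return 2 * n * D**2, 4 * n * D**2 * math.ceil(math.log2(D))
```

For example, the three-qubit GHZ state |000⟩ + |111⟩ is an MPS with D = 2, and `rbm_size_bound(10, 4)` reproduces the counts for a ten-site chain with bond dimension 4.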

Tensor network representation of chiral states Early attempts either did not impose locality3 or targeted only expectation values of local observables rather than the wave function.4 Recent progress5 shows that local tensor networks can describe chiral topological states, but the examples there are all gapless. Dubail and Read's no-go theorem: for any chiral local free-fermion tensor network state, any local parent Hamiltonian is gapless. It seems difficult to obtain a local tensor network representation of a gapped chiral topological state; this is an open problem in the community. 3 Z.-C. Gu, F. Verstraete, and X.-G. Wen. arXiv:1004.2563. 4 B. Béri and N. R. Cooper. Phys. Rev. Lett. 106, 156401, 2011. 5 T. B. Wahl, H.-H. Tu, N. Schuch, and J. I. Cirac. Phys. Rev. Lett. 111, 236805, 2013; J. Dubail and N. Read. Phys. Rev. B 92, 205307, 2015.

Fermionic neural network states

|ψ⟩ = ( ∫ dξ_{|V|+1} dξ_{|V|+2} ⋯ dξ_{|V|+|H|} exp( Σ_{j,k=1}^{|V|+|H|} w_{jk} ξ_j ξ_k ) ) |0⟩

V = {v_1, v_2, …, v_{|V|}} is the set of visible units carrying Grassmann numbers ξ_1, ξ_2, …, ξ_{|V|}. H = {h_1, h_2, …, h_{|H|}} is the set of hidden units carrying ξ_{|V|+1}, ξ_{|V|+2}, …, ξ_{|V|+|H|}. We identify ξ_1, ξ_2, …, ξ_{|V|} with the creation operators c†_1, c†_2, …, c†_{|V|}. Locality in fermionic BMs can be defined in the same way as in spin systems.

p + ip superconductor

H = Σ_{x∈Z²} ( c†_{x+x̂} c_x + c†_{x+ŷ} c_x + c†_{x+x̂} c†_x + i c†_{x+ŷ} c†_x + h.c. ) − 2µ Σ_{x∈Z²} c†_x c_x = ∫_{BZ} d²k H_k,

H_k = Δ_k c†_k c†_{−k} + Δ̄_k c_{−k} c_k + 2M_k c†_k c_k,

Δ_k = sin k_x + i sin k_y,  M_k = cos k_x + cos k_y − µ,  E_k = 2 √(|Δ_k|² + M_k²),

|g.s.⟩ = exp( (1/2) ∫_{BZ} d²k (Δ_k / (E_k + 2M_k)) c†_{−k} c†_k ) |0⟩.

This model describes topological superconductors with opposite chirality for −2 < µ < 0 and 0 < µ < 2, respectively.
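The stated phase boundaries can be checked numerically from the dispersion E_k = 2√(|Δ_k|² + M_k²): the quasiparticle gap closes at µ = −2, 0, 2 and stays open in between. A minimal sketch (ours, not from the talk; the helper name `gap` is hypothetical):

```python
import numpy as np

def gap(mu, n=201):
    """Minimum quasiparticle energy E_k = 2*sqrt(|Delta_k|^2 + M_k^2)
    over a discretized Brillouin zone, for the p + ip model with
    Delta_k = sin kx + i sin ky and M_k = cos kx + cos ky - mu."""
    k = np.linspace(-np.pi, np.pi, n)    # grid includes 0 and +-pi
    kx, ky = np.meshgrid(k, k)
    delta_sq = np.sin(kx)**2 + np.sin(ky)**2
    M = np.cos(kx) + np.cos(ky) - mu
    return 2.0 * np.sqrt(delta_sq + M**2).min()
```

The gap vanishes at k = (π, π), (π, 0)/(0, π), and (0, 0) for µ = −2, 0, 2 respectively, separating the two chiral phases from the trivial ones.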

Neural network representation of chiral states Approximate representations are acceptable, and we have to work with a finite system size in order to rigorously quantify the error of the approximation and the succinctness of the representation. Example: A gapped ground state of a chain of n spins can be approximated (in the sense of 99% fidelity) by a matrix product state of bond dimension 2^{Õ(log^{3/4} n)}.6 On a square lattice of size n × n, we construct an O(log n)-local neural network state |n.n.s.⟩ with one visible and one hidden unit per site such that |⟨n.n.s.|g.s.⟩| > 1 − 1/poly(n). The same result can be established for the ground state of any translationally invariant gapped free-fermion system. 6 I. Arad, A. Kitaev, Z. Landau, and U. Vazirani. arXiv:1301.1162.

Summary We provide a polynomial-time algorithm that constructs a (local) neural network representation for any (local) tensor network state. The construction is almost optimal in the sense that the number of parameters in the neural network representation is almost linear in the number of nonzero parameters in the tensor network representation. Despite the difficulty of representing (gapped) chiral topological states with local tensor networks, we construct a quasi-local neural network representation for a chiral p-wave superconductor. This is an explicit and physically interesting example of neural network states going beyond tensor network states.

Outlook Numerical study of chiral topological order with neural network states.7 The parton approach may allow us to construct quasi-local neural network representations for strongly interacting chiral topological states (e.g., fractional quantum Hall states). What is the representational power of neural network states based on RBMs? 7 I. Glasser et al. The geometry of neural network states, string-bond states and chiral topological order. Preceding talk in this workshop.