Machine Learning with Tensor Networks


1 Machine Learning with Tensor Networks. E.M. Stoudenmire and David J. Schwab, Advances in Neural Information Processing Systems 29, arXiv:1605.05775. Beijing, June 2017

2 Machine learning has physics in its DNA. [Figure: spin configurations ↑↓] Boltzmann Machines ↔ the disordered Ising model; Deep Belief Networks ↔ the renormalization group. P. Mehta and D.J. Schwab, arXiv:1410.3831

3 Convolutional neural network ↔ "MERA" tensor network [side-by-side figures]

4 Impact of tensor networks in physics: critical phenomena, quantum gravity, high-precision quantum chemistry

5 Are tensor networks useful for machine learning? This talk: tensor networks can compress the weights of powerful machine learning models. Benefits include: linear scaling, adaptive optimization, feature sharing

6 Prior tensor networks + machine learning: Markov random field models (Novikov et al., Proceedings of the 31st ICML, 2014); large-scale PCA (Lee, Cichocki, arXiv:, 2014); feature extraction of tensor data (Bengua et al., IEEE Congress on Big Data, 2015); compressing weights of neural nets (Novikov et al., Advances in Neural Information Processing Systems, 2015)

7 What are Tensor Networks?

8 Original setting is quantum mechanics. Spin model (transverse-field Ising model): $\hat{H} = -\sum_j \left( \sigma^z_j \sigma^z_{j+1} + h\,\sigma^x_j \right)$. For $h \ll 1$ the ground state is ordered (↑↑↑↑↑↑↑); for $h \gg 1$ the spins align with the transverse field

9 A wavefunction is just a rule mapping spin configurations $s_1 s_2 s_3 \cdots s_8$ to numbers. [Table of example configurations, e.g. ↑↓↑↑↑↑↑↑, ↑↑↓↑↑↑↑↑, ..., each paired with its amplitude] Simplest rule: store every amplitude separately

10 Let's make a different rule. Introduce matrices, one for each spin state: $M^{\uparrow} = [\,\cdots\,]$, $M^{\downarrow} = [\,\cdots\,]$

11–17 Compute the amplitude by multiplying matrices together (with boundary vectors $v_L$ and $v_R$), one matrix per spin: $\Psi(\uparrow\downarrow\uparrow\uparrow\downarrow) = v_L\, M^{\uparrow} M^{\downarrow} M^{\uparrow} M^{\uparrow} M^{\downarrow}\, v_R$

18 The same rule assigns an amplitude to every configuration: $\Psi(\uparrow\uparrow\downarrow\downarrow\downarrow) = v_L\, M^{\uparrow} M^{\uparrow} M^{\downarrow} M^{\downarrow} M^{\downarrow}\, v_R$, $\quad \Psi(\uparrow\downarrow\downarrow\uparrow\uparrow) = v_L\, M^{\uparrow} M^{\downarrow} M^{\downarrow} M^{\uparrow} M^{\uparrow}\, v_R$

19 This rule is called a matrix product state (MPS): $\Psi(s_1, s_2, s_3, s_4) = v_L\, M^{s_1} M^{s_2} M^{s_3} M^{s_4}\, v_R$. Matrices can vary from site to site. The size of the matrices is called $m$ (the "bond dimension"). For $m = 2^{N/2}$ an MPS can represent any state of $N$ spins. Really just a way of compressing a big tensor
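For concreteness, a minimal numpy sketch of the amplitude rule above (not the speakers' code; the shapes, random values, and the name `amplitude` are all assumptions of this sketch): pick one matrix per spin and multiply them between the boundary vectors.

```python
import numpy as np

N, m = 6, 4                        # number of spins, bond dimension
rng = np.random.default_rng(0)
vL = rng.normal(size=m)            # left boundary vector v_L
vR = rng.normal(size=m)            # right boundary vector v_R
M = rng.normal(size=(N, 2, m, m))  # M[j][s]: matrix for spin s (0=up, 1=down) at site j

def amplitude(spins):
    """Psi(s_1..s_N) = v_L . M^{s_1} M^{s_2} ... M^{s_N} . v_R"""
    vec = vL
    for j, s in enumerate(spins):
        vec = vec @ M[j, s]        # one O(m^2) matrix-vector product per site
    return vec @ vR

print(amplitude((0, 1, 0, 0, 1, 0)))  # amplitude of up,down,up,up,down,up
```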

20 What have we gained? By representing a wavefunction as an MPS with small matrices (small bond dimension $m$), we've represented $2^N$ amplitudes using only about $2 N m^2$ parameters. It is efficient to compute properties of an MPS, or to optimize an MPS (DMRG algorithm)
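A quick back-of-the-envelope check of the counting on this slide (the particular values of N and m are chosen only for illustration):

```python
N, m = 50, 10
print(f"raw amplitudes: 2^{N} = {2**N:,}")            # 1,125,899,906,842,624
print(f"MPS parameters: 2*N*m^2 = {2 * N * m**2:,}")  # 10,000
```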

21 MPS = matrix product state. MPS come with powerful optimization techniques (the DMRG algorithm). White, PRL 69, 2863 (1992); Stoudenmire, White, PRB 87, 155137 (2013)

22 Tensor Diagrams (Briefly)

23 Helpful to draw an $N$-index tensor as a blob with $N$ lines: $\Psi^{s_1 s_2 s_3 \cdots s_N}$ = [blob with legs labeled $s_1, s_2, s_3, \ldots, s_N$]

24 Diagrams for simple tensors: a vector $v_j$ has one line, a matrix $M_{ij}$ has two lines, a three-index tensor $T_{ijk}$ has three lines

25 Joining lines implies contraction; index names can be omitted: $\sum_j M_{ij} v_j$, $\quad \sum_j A_{ij} B_{jk} = (AB)_{ik}$, $\quad \sum_{ij} A_{ij} B_{ji} = \mathrm{Tr}[AB]$
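In code, "joining lines" is just index summation. A small numpy.einsum sketch of the three diagrams on this slide (array sizes assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 4))
B = rng.normal(size=(4, 3))
v = rng.normal(size=4)

Av   = np.einsum('ij,j->i', A, v)    # one joined line: matrix-vector product
AB   = np.einsum('ij,jk->ik', A, B)  # shared line j contracted: (AB)_{ik}
trAB = np.einsum('ij,ji->', A, B)    # all lines closed into a loop: Tr[AB]

assert np.allclose(AB, A @ B)
assert np.isclose(trAB, np.trace(A @ B))
```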

26–27 Matrix product state in diagram notation: $\Psi^{s_1 s_2 s_3 s_4 s_5 s_6} = \sum_{\{\alpha\}} M^{s_1}_{\alpha_1} M^{s_2}_{\alpha_1 \alpha_2} M^{s_3}_{\alpha_2 \alpha_3} M^{s_4}_{\alpha_3 \alpha_4} M^{s_5}_{\alpha_4 \alpha_5} M^{s_6}_{\alpha_5}$. Suppressing the index names is very convenient
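To make the index pattern explicit, a toy numpy sketch (sizes and variable names are assumptions of this sketch) that rebuilds the full $2^6$-component tensor from its MPS factors:

```python
import numpy as np

m = 3
rng = np.random.default_rng(4)
M1   = rng.normal(size=(2, m))         # M^{s_1}_{a1}: left boundary tensor
Mmid = rng.normal(size=(4, 2, m, m))   # M^{s_j}_{a_{j-1} a_j} for j = 2..5
M6   = rng.normal(size=(2, m))         # M^{s_6}_{a5}: right boundary tensor

# Contract the five internal (bond) lines a..e, keep the six spin lines:
Psi = np.einsum('ia,jab,kbc,lcd,mde,ne->ijklmn',
                M1, Mmid[0], Mmid[1], Mmid[2], Mmid[3], M6)
print(Psi.shape)   # (2, 2, 2, 2, 2, 2): all 64 amplitudes from far fewer parameters
```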

28 Besides MPS, other successful tensor networks are PEPS and MERA. PEPS (2D systems): Verstraete, Cirac, cond-mat/0407066 (2004). MERA (critical systems): Evenbly, Vidal, PRB 79, 144108 (2009). Review: Orus, Ann. Phys. 349, 117 (2014)

29 Learning with Tensor Networks

30 Proposal: 1. Lift data to an exponentially higher-dimensional space (feature space = Hilbert space). 2. Apply a linear classifier in feature space. 3. Compress the weights using a tensor network. The following slides use the feature map of Novikov et al. Novikov, Trofimov, Oseledets, "Exponential Machines", arXiv:1605.03795; Stoudenmire, Schwab, "Supervised Learning with Tensor Networks", arXiv:1605.05775

31 Original / raw data vectors $x = (x_1, x_2, x_3, \ldots, x_N)$. Example of grayscale images: the components of $x$ are pixels, $x_j \in [0, 1]$

32 1. Lift data to an exponentially higher-dimensional space (feature space = Hilbert space): $x = (x_1, x_2, x_3, \ldots, x_N) \xrightarrow{\text{lift}} \Phi(x) = (1,\; x_1, x_2, x_3, \ldots,\; x_1 x_2, x_1 x_3, x_2 x_3, \ldots,\; x_1 x_2 x_3, x_1 x_2 x_4, \ldots,\; x_1 x_2 x_3 \cdots x_N)$ (singles, pairs, triples, ..., up to the $N$-tuple)

33 2. Apply a linear classifier in feature space: $f(x) = W \cdot \Phi(x) = \sum_{\{s\}} W_{s_1 s_2 s_3 \cdots s_N}\, x_1^{s_1} x_2^{s_2} x_3^{s_3} \cdots x_N^{s_N}$, with $s_j = 0, 1$. The weights form an $N$-index tensor, just like an $N$-site wavefunction

34 $N=3$ example: $f(x) = W \cdot \Phi(x) = \sum_{\{s\}} W_{s_1 s_2 s_3}\, x_1^{s_1} x_2^{s_2} x_3^{s_3} = W_{000} + W_{100} x_1 + W_{010} x_2 + W_{001} x_3 + W_{110} x_1 x_2 + W_{101} x_1 x_3 + W_{011} x_2 x_3 + W_{111} x_1 x_2 x_3$. Contains the linear classifier and various polynomial kernels

35 3. Compress the weights as a tensor network: $W_{s_1 s_2 s_3 \cdots s_N} \approx M^{s_1} M^{s_2} M^{s_3} \cdots M^{s_N}$. Could also use MERA or PEPS instead of MPS

36 Tensor diagrams of the approach: $\Phi(x)$ is the tensor product of local feature vectors, one per pixel, with $\phi^{s_j}(x_j) = \begin{bmatrix} 1 \\ x_j \end{bmatrix}$

37 Other choices for the local feature map include $\begin{bmatrix} 1 \\ x_j \\ x_j^2 \end{bmatrix}$ or $\begin{bmatrix} \cos(\tfrac{\pi}{2} x_j) \\ \sin(\tfrac{\pi}{2} x_j) \end{bmatrix}$
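A sketch of these local feature maps and of the full tensor-product lift, which is only feasible to materialize for tiny N (the helper names here are mine, not from the talk; the MPS approach never builds this object):

```python
import numpy as np

def phi_poly(xj):    # the [1, x_j] map from slide 36
    return np.array([1.0, xj])

def phi_trig(xj):    # the [cos(pi x_j / 2), sin(pi x_j / 2)] alternative
    return np.array([np.cos(np.pi * xj / 2), np.sin(np.pi * xj / 2)])

def full_feature_tensor(x, phi):
    """Phi(x) as the outer product of local vectors: 2^N components."""
    Phi = np.array(1.0)
    for xj in x:
        Phi = np.multiply.outer(Phi, phi(xj))
    return Phi

x = np.array([0.2, 0.7, 0.5])
Phi = full_feature_tensor(x, phi_poly)        # shape (2, 2, 2)
assert np.isclose(Phi[1, 1, 0], x[0] * x[1])  # the x1*x2 "pairs" term of slide 32
```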

38 Tensor diagrams of the approach: $f(x) = W \cdot \Phi(x) = \sum_{\{s\}} \left( M^{s_1} M^{s_2} \cdots M^{s_N} \right) \Phi^{s_1 s_2 \cdots s_N}(x)$
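The payoff of the MPS form is that $f(x)$ can be evaluated without ever constructing the $2^N$-component weight tensor or feature vector. A numpy sketch under assumed shapes (random weights; the names are mine):

```python
import numpy as np

N, m = 8, 5
rng = np.random.default_rng(2)
vL = rng.normal(size=m)
vR = rng.normal(size=m)
M = rng.normal(size=(N, 2, m, m))   # MPS factors M^{s_j} of the weight tensor W

def phi(xj):                        # local feature map [1, x_j]
    return np.array([1.0, xj])

def f(x):
    """f(x) = W . Phi(x), contracted one site at a time: O(N m^2) per input."""
    vec = vL
    for j, xj in enumerate(x):
        Mj = np.tensordot(phi(xj), M[j], axes=(0, 0))  # sum_s phi_s(x_j) M^{s}
        vec = vec @ Mj
    return vec @ vR

print(f(rng.uniform(size=N)))
```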

39–44 Linear scaling: can use an algorithm similar to DMRG to optimize $f(x) = W \cdot \Phi(x)$. Scaling is $N\, N_T\, m^3$, where $N$ = size of input, $N_T$ = size of training set, $m$ = MPS bond dimension. Could improve further with stochastic gradient methods

45 Linear scaling: the model is similar to kernel learning, but can be trained without the "kernel trick", avoiding its $N_T^2$ scaling problem

46 Does the weight tensor obey an "area law"? More entangled than a ground-state wavefunction? Less entangled? Do an experiment to find out...

47 MNIST Experiment. MNIST is a benchmark data set of grayscale handwritten digits (labels $\ell = 0, 1, 2, \ldots, 9$): 60,000 labeled training images, 10,000 labeled test images

48 MNIST Experiment. Details: shrink images from 28×28 to 14×14. Trained 10 models, one to recognize each digit; the largest output is the prediction. Minimize the quadratic cost function $C = \frac{1}{N_T} \sum_{n=1}^{N_T} \left( W^{\ell} \cdot \Phi(x_n) - y_n^{\ell} \right)^2 + \lambda \|W\|^2$
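A sketch of this cost with one-hot targets $y_n$ (all names here are assumptions of the sketch, not the talk's code; the regularization term is passed in precomputed):

```python
import numpy as np

def quadratic_cost(f, xs, ys, num_labels=10, lam=0.0, weight_norm_sq=0.0):
    """C = (1/N_T) sum_n sum_l (f^l(x_n) - y_n^l)^2 + lam * |W|^2.
    f(x) returns the vector of num_labels model outputs; ys are integer labels."""
    cost = 0.0
    for x, y in zip(xs, ys):
        target = np.eye(num_labels)[y]        # one-hot vector y_n
        cost += np.sum((f(x) - target) ** 2)
    return cost / len(xs) + lam * weight_norm_sq

# toy usage with a stand-in scorer on 14x14 = 196-pixel inputs:
xs = [np.random.uniform(size=196) for _ in range(4)]
ys = [0, 3, 7, 9]
print(quadratic_cost(lambda x: np.zeros(10), xs, ys))
```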

49 MNIST Experiment. One-dimensional mapping of the 2D image pixels onto the MPS chain [figure]

50 MNIST Experiment. Results:
Bond dimension m = 10: test set error ~5% (500/10,000 incorrect)
Bond dimension m = 20: test set error ~2% (200/10,000 incorrect)
Bond dimension m = 120: test set error 0.97% (97/10,000 incorrect)

51 MNIST Experiment Demo

52 MNIST lives in a friendly neighborhood of Hilbert space (feature space) [diagram: the MNIST weight tensor W sits near the MPS region]

53 Situation for other data sets? [diagram: optimistic case, typical machine learning weights W lie near the tensor-network region; pessimistic case, they lie far from it]

54 Benefits of Tensor Network Models: $f(x) = W \cdot \Phi(x)$

55 $f(x) = W \cdot \Phi(x)$. There are many interesting benefits of using tensor network weights. Two highlighted here: 1. Adaptive training; 2. Feature sharing

56 1. Tensor networks are adaptive: [figure] the boundary pixels of the grayscale training data are not useful for learning

57 2. Feature sharing. Multi-class decision function: $f^{\ell}(x) = W^{\ell} \cdot \Phi(x)$, where the index $\ell$ runs over the possible labels. The predicted label is $\operatorname{argmax}_{\ell} f^{\ell}(x)$

58 2. Feature sharing: $f^{\ell}(x) = W^{\ell} \cdot \Phi(x)$. Different central tensors carry the label index $\ell$; the "wings" of the MPS are shared between the models, which regularizes them
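One way to realize this structure in code (a sketch; placing the label tensor on a single central bond, and all shapes and names, are my assumptions, not the talk's):

```python
import numpy as np

N, m, L = 8, 5, 10
rng = np.random.default_rng(3)
vL, vR = rng.normal(size=m), rng.normal(size=m)
M = rng.normal(size=(N, 2, m, m))   # shared "wing" tensors
C = rng.normal(size=(L, m, m))      # central tensor carrying the label index l

def phi(xj):
    return np.array([1.0, xj])

def scores(x, center=4):
    """f^l(x) for all labels l: contract both wings, then the central tensor."""
    left, right = vL, vR
    for j in range(center):                      # left wing, shared by all labels
        left = left @ np.tensordot(phi(x[j]), M[j], axes=(0, 0))
    for j in range(N - 1, center - 1, -1):       # right wing, also shared
        right = np.tensordot(phi(x[j]), M[j], axes=(0, 0)) @ right
    return np.einsum('i,lij,j->l', left, C, right)

x = rng.uniform(size=N)
print(int(np.argmax(scores(x))))                 # predicted label argmax_l f^l(x)
```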

59–62 2. Feature sharing: $f^{\ell}(x)$ progressively learns shared features, which are delivered to the central tensor carrying the label $\ell$

63 Implications for quantum computing? The weights formally inhabit the same space as the quantum Hilbert space (the space of wavefunctions). Negative outlook: the weights have low entanglement, so a quantum computer is not needed. Positive outlook: a quantum computer could train extremely expressive models; tensor networks are equivalent to finite-depth quantum circuits...

64 Connections to Other Approaches. Graphical models: like tensor networks but with positive weights (Rolfe, "Multifactor Expectation Maximization..."). Weighted finite automata: like translation-invariant MPS, trained with a spectral method (Balle et al., "Spectral Learning...", Mach. Learn. 96:33-63, 2014). Neural networks: "ConvAC" networks with linear activations and product pooling are equivalent to tensor networks (Levine et al., "Deep Learning and Quantum Entanglement...", arXiv:1704.01552)
