Tensor network vs Machine learning. Song Cheng ( 程嵩 ) IOP, CAS

1 Tensor network vs Machine learning Song Cheng ( 程嵩 ) IOP, CAS physichengsong@iphy.ac.cn

2 Outline Tensor network in a nutshell TN concepts in machine learning TN methods in machine learning

3 Outline Tensor network in a nutshell TN concepts in machine learning TN methods in machine learning

4 Orus, R. (2014). A Practical Introduction to Tensor Networks: Matrix Product States and Projected Entangled Pair States. Annals of Physics, 349, 117-158.

5 $\Psi(v) = \sum_{ijkl} c_{ijkl}\,\phi_i(v_1)\,\phi_j(v_2)\,\phi_k(v_3)\,\phi_l(v_4)\cdots$ Tensors are local building blocks for the quantum state (like LEGO bricks). Locality leads to a low-rank approximation. Orus, R. (2014). A Practical Introduction to Tensor Networks: Matrix Product States and Projected Entangled Pair States. Annals of Physics, 349, 117-158.
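To make the low-rank point concrete, here is a minimal NumPy sketch (my illustration, not code from the talk) that splits a 4-index coefficient tensor c_{ijkl} into a chain of small tensors (an MPS) by repeated SVDs with truncation:

```python
import numpy as np

d = 2                                   # local (physical) dimension
c = np.random.rand(d, d, d, d)          # coefficient tensor c_{ijkl}
c /= np.linalg.norm(c)

chi = 2                                 # bond dimension kept after truncation
tensors = []
rest, left = c.reshape(d, -1), 1
for _ in range(3):                      # three splits for four sites
    u, s, vh = np.linalg.svd(rest, full_matrices=False)
    k = min(chi, len(s))
    tensors.append(u[:, :k].reshape(left, d, k))           # local MPS tensor
    rest = (np.diag(s[:k]) @ vh[:k]).reshape(k * d, -1)
    left = k
tensors.append(rest.reshape(left, d, 1))                    # last site

# Contract the chain back and measure the truncation error.
approx = tensors[0]
for t in tensors[1:]:
    approx = np.tensordot(approx, t, axes=([-1], [0]))
approx = approx.reshape(d, d, d, d)
print("truncation error:", np.linalg.norm(c - approx))
# For a random (highly entangled) c the error is sizable; for physical,
# weakly entangled states it is small -- which is why the LEGO picture works.
```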

6

7 1. Find the right TN representation. 2. Contract the TN to calculate physical quantities.
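Step 2 is just index contraction. As a hedged toy example of my own (not from the talk), the partition function of the zero-field 1D classical Ising ring is a chain of identical 2x2 transfer matrices, contracted either with matrix powers or with np.einsum, the workhorse for general networks:

```python
import numpy as np

beta = 0.5                                    # inverse temperature (J = 1, h = 0)
T = np.array([[np.exp(beta),  np.exp(-beta)],
              [np.exp(-beta), np.exp(beta)]])  # 2x2 transfer matrix

# Contract a ring of N transfer matrices: Z = Tr(T^N).
N = 10
Z = np.trace(np.linalg.matrix_power(T, N))
print("Z(N=10) =", Z)

# The same contraction for a 4-site ring, written index by index.
Z4 = np.einsum('ab,bc,cd,da->', T, T, T, T)
print("Z(N=4)  =", Z4, "==", np.trace(np.linalg.matrix_power(T, 4)))
```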

8

9 Coarse-graining Projection Variation

10 Outline Tensor network in a nutshell TN concepts in machine learning TN methods in machine learning

11 First contact: 2013

12 Bény, C. (2013). Deep learning and the renormalization group. arXiv:1301.3124 [quant-ph]. Mehta, P., & Schwab, D. J. (2014). An exact mapping between the variational renormalization group and deep learning. arXiv:1410.3831.

13 Cichocki, A. (2014). Tensor Networks for Big Data Analytics and Large-Scale Optimization Problems. arXiv preprint [cs, math].

14 Cichocki, A. et al. (2016). Low-Rank Tensor Networks for Dimensionality Reduction and Large-Scale Optimization Problems: Perspectives and Challenges, Part 1. Foundations and Trends in Machine Learning, 9.

15 Fashionable: 2015

16

17 Carrasquilla, J., & Melko, R. G. (2016). Machine learning phases of matter. arXiv:1605.01735.

18 Stoudenmire, E. M., & Schwab, D. J. (2016). Supervised Learning with Quantum-Inspired Tensor Networks. arXiv:1605.05775 [cond-mat, stat].

19 Carleo, G., & Troyer, M. (2017). Solving the quantum many-body problem with artificial neural networks. Science, 355(6325), 602-606.

20 Reasoning: 2016-

21 Number of atoms in the universe: ~10^80. Number of possible 28 x 28 gray images: 256^784 ≈ 10^1888. Number of parameters in an RBM: 10^4 ~ 10^8. Why do they succeed? The possible space of images is vast, but meaningful images are limited by the laws of physics. H. W. Lin and M. Tegmark, Why does deep and cheap learning work so well? arXiv:1608.08225.
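The counting on this slide is easy to reproduce; a short check, assuming 256 gray levels per pixel (my assumption for the slide's image space):

```python
from math import log10

# 28x28 pixels, 256 gray levels per pixel (assumed).
image_space = 784 * log10(256)
print("atoms in the observable universe ~ 10^80")
print(f"possible 28x28 gray images       ~ 10^{image_space:.0f}")   # ~ 10^1888
print("RBM parameters                   ~ 10^4 .. 10^8")
```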

22 Lin, H. W., & Tegmark, M. (2016). Why does deep and cheap learning work so well? arXiv:1608.08225 [cond-mat, stat].

23 Number of atoms in the universe: ~10^80. Number of states in the many-body Hilbert space: tremendous! Number of parameters in many-body numerical algorithms: negligible. Why do they succeed? In the many-body Hilbert space, most physical quantum many-body states fulfill the area law. H. W. Lin and M. Tegmark, Why does deep and cheap learning work so well? arXiv:1608.08225.

24 Similarity between machine learning and quantum many-body physics: both use few parameters to approximate an exponentially large number of probabilities of the data. H. W. Lin and M. Tegmark, Why does deep and cheap learning work so well? arXiv:1608.08225.
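The same "few parameters vs. exponentially many amplitudes" arithmetic on the physics side, as a rough sketch (the values of N, d, D are illustrative choices of mine, not from the talk):

```python
from math import log10

N, d, D = 100, 2, 50                  # spins, local dimension, MPS bond dimension
hilbert_log10 = N * log10(d)          # dim(Hilbert space) = d^N
mps_params = N * d * D ** 2           # parameters of a bond-dimension-D MPS
print(f"Hilbert-space dimension ~ 10^{hilbert_log10:.0f}")   # ~ 10^30
print(f"MPS parameters          = {mps_params:,}")           # 500,000
```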

25 Deng, D.-L., Li, X., & Das Sarma, S. (2017). Quantum Entanglement in Neural Network States. Physical Review X, 7(2), 021021.

26 Gao, X., & Duan, L.-M. (2017). Efficient Representation of Quantum Many-body States with Deep Neural Networks. arXiv preprint [cond-mat, quant-ph].

27 Chen, J., Cheng, S., Xie, H., Wang, L., & Xiang, T. (2017). On the Equivalence of Restricted Boltzmann Machines and Tensor Network States. arXiv preprint [cond-mat, quant-ph, stat].

28 Cheng, S., Chen, J., & Wang, L. (2017). Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines. arXiv preprint [cond-mat, physics, quant-ph, stat].

29 Levine, Y., Yakira, D., Cohen, N., & Shashua, A. (2017). Deep Learning and Quantum Entanglement: Fundamental Connections with Implications to Network Design. arXiv:1704.01552 [quant-ph].

30 Neural network quantum states vs tensor network states

31 Huang, Y., & Moore, J. E. (2017). Neural network representation of tensor network and chiral states. arXiv preprint [cond-mat].

32 Glasser, I., Pancotti, N., August, M., Rodriguez, I. D., & Cirac, J. I. (2017). Neural-Network Quantum States, String-Bond States and Chiral Topological States. arXiv preprint [cond-mat, quant-ph, stat].

33 Tensor concepts in language models

34 Gallego, A. J., & Orus, R. (2017). The physical structure of grammatical correlations: equivalences, formalizations and consequences. arXiv preprint [cond-mat, physics, quant-ph].

35 Benefits: entanglement spectrum (see the sketch below); gauge invariance; exact tensor decomposition mathematics; evaluation of expressive power; tensor algorithms.

36 Disadvantages: high cost; non-local terms are excluded.
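As a concrete illustration of the first benefit listed on slide 35, the entanglement spectrum of any bipartition follows from a single SVD. A minimal NumPy sketch on a random state (my toy example, not from the talk):

```python
import numpy as np

dA, dB = 2 ** 4, 2 ** 4                    # 4 + 4 spins, cut in the middle
psi = np.random.rand(dA, dB)               # state reshaped as a matrix psi_{A,B}
psi /= np.linalg.norm(psi)

s = np.linalg.svd(psi, compute_uv=False)   # Schmidt coefficients
spectrum = -np.log(s ** 2)                 # entanglement "energies" -log(lambda_i)
entropy = -np.sum(s ** 2 * np.log(s ** 2)) # von Neumann entanglement entropy
print("lowest levels:", np.sort(spectrum)[:5])
print("entanglement entropy:", entropy)
```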

37 Outline Tensor network in a nutshell TN concepts in machine learning TN methods in machine learning

38 Representation. Stoudenmire, E. M., & Schwab, D. J. (2016). Supervised Learning with Quantum-Inspired Tensor Networks. arXiv:1605.05775 [cond-mat, stat].

39 target

40 (tensor) gradient descent
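Slides 38-40 in code form: a hedged sketch of the Stoudenmire-Schwab setup, with their local feature map and an MPS whose last tensor carries the label index (the tensor shapes and the label placement are my simplifying assumptions); training means gradient descent on these tensors against a target such as a one-hot class vector.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_map(x):
    # Local feature map from Stoudenmire & Schwab: pixel x in [0,1] -> 2-vector.
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

# Toy sizes: N pixels, local dim d, bond dim D, 10 classes (illustrative only).
N, d, D, n_classes = 16, 2, 10, 10
tensors = [rng.normal(scale=0.1,
                      size=(1 if i == 0 else D,
                            d,
                            D if i < N - 1 else n_classes))
           for i in range(N)]                    # label index on the last tensor

def scores(pixels):
    msg = np.ones(1)                             # left boundary vector
    for x, A in zip(pixels, tensors):
        msg = np.einsum('l,ldr,d->r', msg, A, feature_map(x))
    return msg                                   # one score per class

x = rng.random(N)                                # a fake "image"
print("class scores:", scores(x))
# Training = (tensor) gradient descent on e.g. the squared error between
# scores(x) and the one-hot target, updating one or two tensors at a time.
```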

41

42

43

44 Zhao-Yu Han, Jun Wang, Heng Fan, Lei Wang, Pan Zhang, Unsupervised Generative Modeling Using Matrix Product States.

45 Training Algorithm: Generative Algorithm:
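A compact way to see what the training and generative algorithms on this slide operate on: the MPS defines a Born-rule distribution p(x) = |psi(x)|^2 / Z. The sketch below is my toy version with brute-force enumeration over a tiny chain (Han et al. instead use canonical forms and DMRG-like sweeps); it evaluates p(x), the negative log-likelihood that training minimizes, and a direct sample.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)

# A tiny random MPS "Born machine" over N binary variables.
N, d, D = 6, 2, 4
mps = [rng.normal(size=(1 if i == 0 else D, d, 1 if i == N - 1 else D))
       for i in range(N)]

def amplitude(bits):
    m = np.ones((1, 1))
    for A, b in zip(mps, bits):
        m = m @ A[:, b, :]                    # contract one site at a time
    return m.item()

# p(x) = |psi(x)|^2 / Z, here by exhaustive enumeration (2^N configurations).
probs = {bits: amplitude(bits) ** 2 for bits in product(range(d), repeat=N)}
Z = sum(probs.values())
probs = {b: p / Z for b, p in probs.items()}

# Training minimizes the negative log-likelihood of the data under p(x);
# generation draws new bitstrings from p(x) (here, directly from the table).
data = [tuple(map(int, rng.integers(0, d, N))) for _ in range(5)]
nll = -np.mean([np.log(probs[x]) for x in data])
print("NLL of toy data:", nll)
sample = list(probs)[rng.choice(len(probs), p=list(probs.values()))]
print("one generated sample:", sample)
```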

46 Liu, D., Ran, S.-J., Wittek, P., Peng, C., García, R. B., Su, G., & Lewenstein, M. (2017). Machine Learning by Two-Dimensional Hierarchical Tensor Networks: A Quantum Information Theoretic Perspective on Deep Architectures. arXiv preprint [cond-mat, physics, quant-ph, stat].

47 Liu, D., Ran, S.-J., Wittek, P., Peng, C., García, R. B., Su, G., & Lewenstein, M. (2017). Machine Learning by Two-Dimensional Hierarchical Tensor Networks: A Quantum Information Theoretic Perspective on Deep Architectures. arXiv preprint [cond-mat, physics, quant-ph, stat].

48 Koch-Janusz, M., & Ringel, Z. (2017). Mutual Information, Neural Networks and the Renormalization Group. arXiv preprint [cond-mat].

49 Stoudenmire, E. M. (2017). Learning Relevant Features of Data with Multi-scale Tensor Networks. arXiv preprint [cond-mat, stat].

50

51 Gao, X., Zhang, Z., & Duan, L. (2017). An efficient quantum algorithm for generative machine learning. arXiv preprint [quant-ph, stat].

52 Thanks

53 Take-home message: Tensor networks are related to many ML architectures. Datasets in ML are not totally unfamiliar to physicists. Tensor viewpoints and techniques could be transferred to ML. Lots of work remains to be done.
