Accelerated Monte Carlo simulations with restricted Boltzmann machines
1 Accelerated Monte Carlo simulations with restricted Boltzmann machines
Li Huang 1, Lei Wang 2
1 China Academy of Engineering Physics
2 Institute of Physics, Chinese Academy of Sciences
July 6, 2017
Machine Learning and Many-Body Physics, KITS, Beijing
3 Quantum Monte Carlo zoo 1
- Auxiliary field quantum Monte Carlo
- Variational Monte Carlo
- Determinant quantum Monte Carlo
- World line quantum Monte Carlo
- Continuous-time quantum Monte Carlo
- Diffusion Monte Carlo
- Reptation Monte Carlo
- Hybrid quantum Monte Carlo
- Gaussian quantum Monte Carlo
- Path integral Monte Carlo
- Diagrammatic quantum Monte Carlo
- Hirsch-Fye quantum Monte Carlo
Figure: Various quantum Monte Carlo algorithms
Ref: Quantum Monte Carlo Methods, 2016
4 Quantum Monte Carlo zoo 2
CT-HYB, CT-INT, LCT-INT, CT-QMC, CT-AUX, CT-J
Figure: Various continuous-time quantum Monte Carlo algorithms
Ref: Rev Mod Phys 83, 349 (2011)
5 Troubles
In my opinion, the applications of quantum Monte Carlo methods are mainly hampered by three problems:
- How to visit the configuration space efficiently?
- Critical slowing down
- Negative sign problem (for fermionic systems)
6 Machine learning + many-body physics 1
Detecting phase transitions in classical or quantum models
Figure: Detecting phase transitions in the Ising model with an ANN
Ref: Nat Phys 13, 431 (2017)
7 Machine learning + many-body physics 2
Representation of quantum many-body states
Figure: Artificial neural network encoding a many-body quantum state
Ref: Science 355, 602 (2017)
8 Machine learning + many-body physics 3
Solving inverse problems
Figure: Sparse modeling approach to analytical continuation of G(τ)
Ref: Phys Rev E 95, (2017)
9 1 Is it possible to use ML tools to accelerate QMC?
- Suppose that it is difficult (or at least not easy) to sample the target distribution directly
- With the available data, we train a restricted Boltzmann machine (RBM), which can be viewed as a proxy (approximation) of the statistical distribution
- We simulate the trained RBM, and then use it to propose new (local or non-local) updates
- The role played by the RBM in the simulation is just like that of a recommender system on an e-commerce platform
10 2
- How to visit the configuration space efficiently? → The acceptance ratio is increased
- Critical slowing down → The autocorrelation time is reduced
- Negative sign problem (for fermionic systems) → NO SOLUTION!
12 Falicov-Kimball model 1: Hamiltonian
Hamiltonian:
\hat{H}_{FK} = \sum_{i,j} \hat{c}^\dagger_i K_{ij} \hat{c}_j + U \sum_{i=1}^{N} \left( \hat{n}_i - \frac{1}{2} \right) \left( x_i - \frac{1}{2} \right)   (1)
Probability distribution:
p_{FK}(x) = e^{-F_{FK}(x)} / Z_{FK}   (2)
Free energy:
F_{FK}(x) = -\frac{\beta U}{2} \sum_{i=1}^{N} x_i - \ln \det \left( 1 + e^{-\beta h} \right)   (3)
- x_i ∈ {0, 1}: occupation number of the localized fermion at site i
- \hat{n}_i = \hat{c}^\dagger_i \hat{c}_i: occupation number operator of the mobile fermions
- K: kinetic energy matrix of the mobile fermions
- h = K + U\,\mathrm{diag}(x - 1/2): single-particle matrix of the mobile fermions in the frozen background x
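As a concrete sketch, the free energy of a localized-fermion configuration can be evaluated by diagonalizing the single-particle matrix of the mobile fermions. The snippet below is a minimal numpy illustration (not the authors' code), assuming h = K + U diag(x − 1/2) and the sign conventions above; the four-site chain is a hypothetical example.

```python
import numpy as np

def fk_free_energy(x, K, U, beta):
    """F_FK(x) = -(beta*U/2) * sum_i x_i - ln det(1 + e^{-beta h}),
    with h = K + U*diag(x - 1/2) the single-particle matrix of the
    mobile fermions in the frozen background x (a 0/1 array)."""
    x = np.asarray(x, dtype=float)
    h = K + U * np.diag(x - 0.5)
    eps = np.linalg.eigvalsh(h)                     # eigenvalues of h
    # ln det(1 + e^{-beta h}) = sum_k ln(1 + e^{-beta eps_k}),
    # computed stably with logaddexp to avoid overflow at large beta
    log_det = np.sum(np.logaddexp(0.0, -beta * eps))
    return -0.5 * beta * U * np.sum(x) - log_det

# four-site open chain with hopping t = 1
t = 1.0
K = -t * (np.eye(4, k=1) + np.eye(4, k=-1))
F = fk_free_energy([1, 0, 1, 0], K, U=4.0, beta=2.0)
```

Working with eigenvalues rather than the determinant itself keeps the evaluation numerically stable at low temperature, where e^{-βh} can overflow.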
13 Falicov-Kimball model 2: Phase diagram
Ref: Phys Rev Lett 117, (2016)
- CDW: charge density wave insulator
- AI: Anderson insulator
- MI: Mott insulator
- FG: Fermi gas
- WL: weakly localized
14 Falicov-Kimball model 3: Lattice QMC
Bit-flip update:
x_i \to 1 - x_i   (4)
Detailed balance condition:
\frac{T(x \to x') A(x \to x')}{T(x' \to x) A(x' \to x)} = \frac{p_{FK}(x')}{p_{FK}(x)}   (5)
Cons:
- Scaling: O(N^4)
- Long autocorrelation times
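The local bit-flip Metropolis scheme above can be sketched as follows. Here `log_weight` is a hypothetical placeholder for −F_FK(x); to keep the snippet self-contained it is replaced by a trivial toy target with independent sites.

```python
import numpy as np

def local_metropolis_sweep(x, log_weight, rng):
    """One sweep of single-site bit-flip Metropolis updates.

    Satisfies detailed balance for p(x) ∝ exp(log_weight(x)): the flip
    x_i -> 1 - x_i is accepted with probability min(1, p(x')/p(x))."""
    x = x.copy()
    lw = log_weight(x)
    for i in range(len(x)):
        x[i] = 1 - x[i]                     # propose flipping site i
        lw_new = log_weight(x)
        if np.log(rng.random()) < lw_new - lw:
            lw = lw_new                     # accept
        else:
            x[i] = 1 - x[i]                 # reject: undo the flip
    return x

# toy target with independent sites: p(x_i = 1) = sigmoid(2.0) ≈ 0.88
rng = np.random.default_rng(0)
log_weight = lambda x: 2.0 * x.sum()
x = np.zeros(20, dtype=int)
occupation = []
for sweep in range(300):
    x = local_metropolis_sweep(x, log_weight, rng)
    if sweep >= 100:                        # discard warm-up sweeps
        occupation.append(x.mean())
mean_occ = np.mean(occupation)
```

For the Falicov-Kimball model each call to `log_weight` would require a diagonalization, O(N^3), hence the O(N^4) cost per sweep quoted above.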
15 RBM 1: Architecture
The RBM is a classical statistical mechanics system defined by the following energy function:
E(x, h) = -\sum_{i=1}^{N} a_i x_i - \sum_{j=1}^{M} b_j h_j - \sum_{i=1}^{N} \sum_{j=1}^{M} x_i W_{ij} h_j   (6)
Figure: hidden layer h_1, ..., h_M with biases b_j; visible layer x_1, ..., x_N with biases a_i; connection weights W_{ij}
16 RBM 2: Recent applications
Connecting with tensor network states:
\Psi(v) = \sum_h e^{-E(v,h)} = \prod_i e^{a_i v_i} \prod_j \left( 1 + e^{b_j + \sum_i v_i W_{ij}} \right)   (7)
\Psi_{MPS}(v) = \mathrm{Tr} \left( \prod_{i=1}^{N_v} A^{(i)}[v_i] \right)   (8)
Ref: arXiv:
17 RBM 3: Recent applications
Searching for new cluster updates
Ref: arXiv:
18 RBM + MC 1: Collecting data
Collect a training data set {x} by the traditional Monte Carlo method in a preliminary calculation
19 RBM + MC 2: Training
Joint probability distribution of the visible and hidden variables:
p(x, h) = e^{-E(x,h)} / Z   (9)
Marginal distribution:
p(x) = \sum_h p(x, h) = e^{-F(x)} / Z   (10)
Free energy of the RBM:
F(x) = -\sum_{i=1}^{N} a_i x_i - \sum_{j=1}^{M} \ln \left( 1 + e^{b_j + \sum_{i=1}^{N} x_i W_{ij}} \right)   (11)
20 RBM + MC 3: Training (cont)
Free energy of the RBM:
F(x) = -\sum_{i=1}^{N} a_i x_i - \sum_{j=1}^{M} \ln \left( 1 + e^{b_j + \sum_{i=1}^{N} x_i W_{ij}} \right)   (12)
Free energy of the Falicov-Kimball model:
F_{FK}(x) = -\frac{\beta U}{2} \sum_{i=1}^{N} x_i - \ln \det \left( 1 + e^{-\beta h} \right)   (13)
We use F(x) to approximate F_{FK}(x), and fit the parameters a_i, b_j, and W_{ij} of the RBM
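A minimal sketch of the RBM free energy: it is cheap to evaluate because the hidden units can be summed out analytically, as in Eq. (12). The function and the tiny 3-visible / 2-hidden example below are illustrative, assuming 0/1 visible units.

```python
import numpy as np

def rbm_free_energy(x, a, b, W):
    """RBM free energy of a visible configuration x (0/1 array):
    F(x) = -sum_i a_i x_i - sum_j ln(1 + exp(b_j + sum_i x_i W_ij))."""
    x = np.asarray(x, dtype=float)
    # logaddexp(0, z) = ln(1 + e^z), evaluated stably per hidden unit
    return -x @ a - np.sum(np.logaddexp(0.0, b + x @ W))

# tiny RBM with N = 3 visible and M = 2 hidden units, random parameters
rng = np.random.default_rng(1)
a, b = rng.normal(size=3), rng.normal(size=2)
W = rng.normal(size=(3, 2))
F = rbm_free_energy([1, 0, 1], a, b, W)
```

Because the hidden sum factorizes over j, the cost is O(NM) per configuration, in contrast with the O(N^3) determinant of F_FK.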
21 RBM + MC 4: Training (cont)
Figure: Fitting of the log probability of the RBM to that of the Falicov-Kimball model (F(x) versus F_{FK} on test samples)
22 RBM + MC 5: Training (cont)
Figure: Connection weights W_{ij} of the RBM: (left) T/t = 0.15, (right) T/t = 0.13
23 RBM + MC 6: Simulating
We simulate the trained RBM using the standard blocked Gibbs sampling approach. The conditional probabilities of the hidden and visible variables read:
p(h|x) = \prod_{j=1}^{M} p(h_j|x)   (14)
p(x|h) = \prod_{i=1}^{N} p(x_i|h)   (15)
p(h_j = 1|x) = \sigma \left( b_j + \sum_{i=1}^{N} x_i W_{ij} \right)   (16)
p(x_i = 1|h) = \sigma \left( a_i + \sum_{j=1}^{M} W_{ij} h_j \right)   (17)
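Blocked Gibbs sampling of Eqs. (14)-(17) can be sketched as below: given x all hidden units are conditionally independent, and vice versa, so each half-step is a single vectorized draw. The function name and the sanity-check parameters (zero couplings, strong visible bias) are illustrative choices, not from the talk.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gibbs_step(x, a, b, W, rng):
    """One blocked Gibbs update of the RBM:
    sample all hidden units h ~ p(h|x) in parallel, then all visible
    units x' ~ p(x|h) in parallel."""
    p_h = sigmoid(b + x @ W)                 # p(h_j = 1 | x), Eq. (16)
    h = (rng.random(b.size) < p_h).astype(float)
    p_x = sigmoid(a + W @ h)                 # p(x_i = 1 | h), Eq. (17)
    x_new = (rng.random(a.size) < p_x).astype(float)
    return x_new, h

rng = np.random.default_rng(0)
N, M = 6, 4
a, b = np.full(N, 50.0), np.zeros(M)         # strong positive visible bias
W = np.zeros((N, M))                         # no couplings: a sanity check
x, h = gibbs_step(np.zeros(N), a, b, W, rng)
```

With zero couplings and a large positive bias a, the visible half-step should return an (almost surely) fully occupied configuration, which makes the update easy to sanity-check.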
24 RBM + MC 7: Simulating (cont)
Figure: Two strategies of proposing updates with the RBM — the standard blocked Gibbs move h ∼ p(h|x), x' ∼ p(x'|h), and a new variant with an additional Metropolis step on the hidden variables
25 RBM + MC 8: RBM + MC
We use the trained RBM to propose new updates (instead of local bit-flip updates), and then apply the Metropolis algorithm to update the Markov chain:
A(x \to x') = \min \left[ 1, \frac{p(x)}{p(x')} \frac{p_{FK}(x')}{p_{FK}(x)} \right]   (18)
Ideally, the acceptance ratio is one if the RBM fits the Falicov-Kimball model perfectly
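Eq. (18) in code: both p(x) and p_FK(x) enter only as ratios, so the unknown normalizations Z and Z_FK cancel and everything can be done with log-probabilities. A minimal sketch with hypothetical `log_p` callables standing in for the RBM and target log-weights:

```python
import numpy as np

def acceptance(log_p_rbm, log_p_fk, x_old, x_new):
    """Metropolis-Hastings acceptance for an RBM-proposed move:
    A = min(1, [p(x)/p(x')] * [p_FK(x')/p_FK(x)]), evaluated in log
    space so normalization constants drop out."""
    log_A = (log_p_rbm(x_old) - log_p_rbm(x_new)) \
          + (log_p_fk(x_new) - log_p_fk(x_old))
    return min(1.0, float(np.exp(log_A)))

# if the RBM matches the target exactly, every proposal is accepted
log_p = lambda x: -2.0 * np.sum(x)
A_perfect = acceptance(log_p, log_p, np.array([0, 1]), np.array([1, 1]))

# an imperfect proxy gives A < 1 for moves it over-recommends
log_p_proxy = lambda x: -1.0 * np.sum(x)
A_imperfect = acceptance(log_p_proxy, log_p, np.array([0, 1]), np.array([1, 1]))
```

The first case illustrates the remark above: a perfectly fitted RBM yields unit acceptance, while any mismatch is exactly corrected by the Metropolis filter.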
27 Preliminary results
Figure: (a) The acceptance ratio and (b) the total energy autocorrelation time as functions of T/t, comparing local updates with RBM-proposed updates (G and G+M schemes)
29 1 Why does it work?
- The trained RBM captures the target distribution correctly
- It is more efficient to explore the configuration space: non-local updates vs local updates
30 2 Is this method general?
No! Owing to the limitation of the RBM architecture, the MC algorithm must have binary degrees of freedom:
- Ising and Z_2 gauge field models
- Determinant quantum Monte Carlo
- CT-AUX
- HF-QMC
- etc.
31 3 Is this idea general?
Sure! We can consider the RBM as an efficient recommender system to accelerate Monte Carlo simulations.
Similar works:
- Using a classical gas model to accelerate CT-INT. Ref: Phys Rev E 95, (2017)
- Using effective Ising models to accelerate DQMC and CT-AUX. Ref: Phys Rev B 95, (2017); Phys Rev B 95, (2017); arXiv: ; arXiv:
32 4 Is there room to improve it?
Sure!
- Self-guiding or online training approaches
- Deep Boltzmann machines or deep belief networks
33 Take-home messages
- It is possible to design special recommender systems to accelerate some Monte Carlo simulations
- The RBM is a good recommender system for the simulation of the Falicov-Kimball model. Ref: Phys Rev B 95, (2017)
- Future work: exploring reliable and efficient recommender systems for CT-HYB, which is the core computational engine of dynamical mean-field theory. Ref: Phys Rev E 95, (2017)
A Video from Google DeepMind
http://www.nature.com/nature/journal/v518/n7540/fig_tab/nature14236_sv2.html
Can machine learning teach us cluster updates?
Lei Wang, Institute of Physics, CAS
https://wangleiphy.github.io