How can ideas from quantum computing improve or speed up neuromorphic models of computation?


Neuromorphic Computation: Architectures, Models, Applications
Associative Memory Models with Adiabatic Quantum Optimization
Kathleen Hamilton, Alexander McCaskey, Jonathan Schrock, Neena Imam and Travis Humble
Quantum Computing Institute, Oak Ridge National Laboratory
How can ideas from quantum computing improve or speed up neuromorphic models of computation?
Oak Ridge, Tennessee, 30 June 2016

Neuromorphic and Quantum Computing

Two exciting new models for computation. Practical interest is driven by the need for more efficient computational platforms, with specialized applications and processors emerging for each:

Neuromorphic computing: adapts features of neural systems to computation.
Quantum computing: adapts principles of quantum physics to computation.

Are there applications where the specialties of each model coincide? Associative memory recall is one candidate.

© 2014 Travis S. Humble

Associative Memory

Associative memory: a data storage mechanism whereby locations are identified according to the stored value.
Content-addressable memory (CAM): a memory that stores key-value pairs and recalls keys when provided with a value.
Auto-associative CAM: a CAM in which the key and the value are the same.
Random-access memory (RAM): a memory that stores key-value pairs and recalls a value when provided with a key (location).

Pattern matching is a form of content-addressable memory.
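The RAM/CAM distinction above can be made concrete in a few lines of Python. This is an illustration of the definitions only, not from the slides; the names and stored values are invented:

```python
# RAM: look up a stored value by its key (location).
# CAM: look up the key(s) whose stored value matches a query.
ram = {0: "cat", 1: "dog", 2: "cat"}   # location -> stored value

def ram_read(key):
    return ram[key]

def cam_read(value):
    # Content-addressable lookup: search by stored value, recall locations.
    return [k for k, v in ram.items() if v == value]

print(ram_read(1))      # dog
print(cam_read("cat"))  # [0, 2]
```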

Models of Associative Memory

Hopfield networks: an associative memory using a recurrent network of computational neurons, whose state evolves toward equilibrium.

Discrete CAM model: consider a network of n neurons, where the i-th neuron is in a bipolar state z_i ∈ {±1} and the synaptic weight w_ij couples neurons i and j. Each neuron is activated when its local field exceeds the activation threshold θ:

z_i = +1 if Σ_{j≠i} w_ij z_j > θ, and z_i = −1 otherwise.

(The slide shows a 4-neuron network z_1, …, z_4 with connectivity between the nodes.)
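A minimal Python sketch of this activation rule (the weight matrix below is illustrative, not from the slides):

```python
# Discrete Hopfield update: z_i = +1 if sum_{j != i} w_ij z_j > theta, else -1.
def update_neuron(i, z, w, theta=0.0):
    field = sum(w[i][j] * z[j] for j in range(len(z)) if j != i)
    return 1 if field > theta else -1

# 4-neuron example with symmetric couplings and zero self-coupling.
w = [[0, 1, -1, 1],
     [1, 0, 1, -1],
     [-1, 1, 0, 1],
     [1, -1, 1, 0]]
z = [1, 1, -1, -1]
z[0] = update_neuron(0, z, w)   # local field is 1 > 0, so z_0 -> +1
```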

Content-Addressable Memory Recall

Storing memories in a Hopfield network: the network's synaptic weights store the memories. Given P memories x^(k) ∈ {±1}^n, the weights are

w_ij = Σ_{k=1}^{P} x_i^(k) x_j^(k).

Recalling memories in a Hopfield network: evolve under a stochastic update rule

z_i = sign( Σ_j w_ij z_j − θ_i ).

Memories are stable fixed points, and convergence is guaranteed because the network energy

E(z; θ) = −(1/2) Σ_{i,j=1}^{n} z_i w_ij z_j − Σ_{i=1}^{n} θ_i z_i

is a Lyapunov function.
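The storage and recall steps can be sketched in Python. This is a simplified illustration (a single stored memory, zero thresholds by default), not the experimental code:

```python
# Hebb rule: w_ij = sum_k x_i^(k) x_j^(k), with zero self-coupling.
def hebb_weights(memories):
    n = len(memories[0])
    w = [[0] * n for _ in range(n)]
    for x in memories:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += x[i] * x[j]
    return w

# Iterative recall: z_i = sign(sum_j w_ij z_j - theta_i), swept repeatedly.
def recall(z, w, theta=None, sweeps=10):
    n = len(z)
    if theta is None:
        theta = [0.0] * n
    z = list(z)
    for _ in range(sweeps):
        for i in range(n):
            field = sum(w[i][j] * z[j] for j in range(n)) - theta[i]
            z[i] = 1 if field >= 0 else -1
    return z

memory = [1, -1, 1, -1, 1, -1]
w = hebb_weights([memory])
corrupted = [1, -1, 1, -1, -1, -1]     # one bit flipped
print(recall(corrupted, w) == memory)  # True: the memory is a stable fixed point
```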

Mapping Recall to Optimization

Recast the update rule in terms of global optimization: instead of iterating z_i = sign( Σ_j w_ij z_j − θ_i ), search directly for the spin configuration that minimizes the network energy,

z* = argmin_z E(z; θ).

The thresholds (biases) still represent the best guess.
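For small n the argmin can be evaluated by brute force, which makes the optimization view concrete. The values below are illustrative; the sign convention follows the energy E(z; θ) = −½ Σ z_i w_ij z_j − Σ θ_i z_i, so a bias θ aligned with the target lowers that state's energy:

```python
from itertools import product

# Network energy E(z; theta) = -1/2 sum_ij z_i w_ij z_j - sum_i theta_i z_i.
def energy(z, w, theta):
    n = len(z)
    e = -0.5 * sum(z[i] * w[i][j] * z[j] for i in range(n) for j in range(n))
    return e - sum(theta[i] * z[i] for i in range(n))

# Recall as global optimization: exhaust all 2^n bipolar configurations.
def argmin_recall(w, theta):
    n = len(w)
    return min(product([-1, 1], repeat=n), key=lambda z: energy(z, w, theta))

# With one Hebb-stored memory and a small bias toward it, the global
# minimum is the memory itself (the bias breaks the +/- degeneracy).
memory = (1, -1, 1, -1)
w = [[memory[i] * memory[j] if i != j else 0 for j in range(4)] for i in range(4)]
theta = [0.1 * m for m in memory]   # bias "best guess" toward the target
print(argmin_recall(w, theta))      # (1, -1, 1, -1)
```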

Adiabatic Quantum Optimization

A quantum algorithm that returns the lowest-energy state of a Hamiltonian. Evaluation makes use of the quantum superposition principle to sample configuration space. Execution depends on adiabatically evolving the quantum system toward a desired Hamiltonian:

Ĥ(t) = A(t) Ĥ_0 + B(t) Ĥ_1, with Ĥ_1 = −Σ_{i,j} w_ij Ẑ_i Ẑ_j − Σ_i θ_i Ẑ_i

encoding E(z; θ). Recovery of the lowest-energy state is the primitive for memory recall.
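The annealing-schedule idea can be mimicked classically. Below is a simulated-annealing sketch, offered only as a classical stand-in for the quantum annealer (it is not the adiabatic algorithm itself), sampling low-energy states of the same Ising-form energy; the schedule and parameter values are illustrative:

```python
import math
import random

# Classical simulated annealing over E(z; theta) = -1/2 z.W.z - theta.z.
def anneal(w, theta, sweeps=200, t_hot=5.0, t_cold=0.05, seed=0):
    rng = random.Random(seed)
    n = len(w)
    z = [rng.choice([-1, 1]) for _ in range(n)]
    for s in range(sweeps):
        # Geometric cooling schedule, loosely analogous to A(t), B(t).
        t = t_hot * (t_cold / t_hot) ** (s / (sweeps - 1))
        for i in range(n):
            # Energy change for flipping spin i, derived from E above.
            field = sum(w[i][j] * z[j] for j in range(n)) + theta[i]
            delta = 2 * z[i] * field
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                z[i] = -z[i]
    return z

memory = [1, -1, 1, -1]
w = [[memory[i] * memory[j] if i != j else 0 for j in range(4)] for i in range(4)]
theta = [0.1 * m for m in memory]   # small bias toward the target memory
state = anneal(w, theta)            # settles into +memory or -memory
```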

Experiments with Quantum Optimization

We use the D-Wave quantum processor: a special-purpose quantum processor that finds the ground state of an Ising Hamiltonian. It is fabricated from coupled arrays of superconducting flux qubits, operated as a quantum annealer, and validated as a probabilistic processor.

Programmability is limited by hardware constraints: size (number of qubits) and bits of precision restrict the range of testable problem instances, while temperature and control systems limit the range of execution scenarios.

Experiments with Quantum Optimization

Step 1: Specify problem instances. Select P memories and set one as the target memory. Calculate the synaptic weights using a learning rule L, and calculate the threshold (bias) for the target memory.

Step 2: Solve the problem instance. Program the Hopfield network into the hardware, execute the quantum optimization program, and repeat execution N times to generate N samples.

Step 3: Confirm the correct memory was recalled. Compute the probability to recover the correct memory.
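The three steps can be sketched end to end in Python, substituting repeated classical recall from random initial states for the N quantum annealing samples. All names and parameter values here are illustrative, not the experimental setup:

```python
import random

# Step 1 -> instance, Step 2 -> N solves, Step 3 -> recall probability.
def run_experiment(n=8, P=2, N=100, bias=0.5, seed=1):
    rng = random.Random(seed)
    # Step 1: P random memories, Hebb-rule weights, and a threshold
    # (bias) toward the target memory.
    memories = [[rng.choice([-1, 1]) for _ in range(n)] for _ in range(P)]
    target = memories[0]
    w = [[sum(x[i] * x[j] for x in memories) if i != j else 0
          for j in range(n)] for i in range(n)]
    theta = [bias * t for t in target]
    # Step 2: "solve" the instance N times from random initial states.
    hits = 0
    for _ in range(N):
        z = [rng.choice([-1, 1]) for _ in range(n)]
        for _ in range(20):          # asynchronous update sweeps
            for i in range(n):
                # Bias enters with a + sign, matching the energy convention
                # E(z; theta) = -1/2 z.W.z - theta.z used for optimization.
                field = sum(w[i][j] * z[j] for j in range(n)) + theta[i]
                z[i] = 1 if field >= 0 else -1
        # Step 3: confirm whether the correct memory was recalled.
        hits += (z == target)
    return hits / N
```

With a strong enough bias the target dominates recall; with weaker bias and more stored memories the recall probability drops, which is what the capacity measurements on the following slides quantify.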

Experimental Measures of Capacity

[Plots: probability to recall the memory correctly, with average accuracy plotted over 100 random instances per spin size, for stored memory set sizes of 0.10n, 0.30n, and 0.50n under the Hebb learning rule. In each case, accuracy increases with increasing bias.]

Associative Memory Models with Adiabatic Quantum Optimization: Concluding Points

Associative memory was modeled using a discrete Hopfield network. Memory recall in Hopfield networks was reduced to a quantum optimization problem, and recall was validated experimentally using the D-Wave processor.

Recent theoretical arguments from Santra et al. suggest exponential capacity when using quantum optimization (arXiv:1602.08149). Our results find the accuracy is lower than expected, most likely due to hardware noise and constraints; continued hardware advances should address these limits.

Seddiqi and Humble, Adiabatic quantum optimization for associative memory recall, Frontiers in Physics 2, 79 (2014)
