Geometric Analogue of Holographic Reduced Representations


Geometric Analogue of Holographic Reduced Representations
Agnieszka Patyk (Ph.D. supervisor: prof. Marek Czachor)
Faculty of Applied Physics and Mathematics, Gdańsk University of Technology, Poland,
and Centrum Leo Apostel (CLEA), Vrije Universiteit Brussel, Belgium
Grimma, 17-19 August 2008

Outline
- Background: distributed representations, previous architectures
- GA model: binary parametrization of the geometric product, Cartan representation, example, signatures
- Test results: recognition tests, blade linearity
- Future work
- Computer program: CartanGA

Distributed representations: Example

green color + triangular shape = green triangle

Idea
- An opposite of, and an alternative to, traditional data structures (lists, databases, etc.).
- Hinton (1986):
  - each concept is represented over a number of units (bytes),
  - each unit participates in the representation of some number of concepts,
  - the size of a distributed representation is usually fixed,
  - the units have either binary or continuous values,
  - data patterns are chosen as random vectors or matrices,
  - in most distributed representations only the overall pattern of activated units has a meaning (resembling a multiply exposed photograph).

Distributed representations: Atomic objects and complex statements

bite_agent ∗ Fido + bite_object ∗ Pat = "Fido bit Pat"   (each bound pair, e.g. bite_agent ∗ Fido, is a chunk)
  roles: bite_agent, bite_object; fillers: Fido, Pat

name ∗ Pat + sex ∗ male + age ∗ 66 = PSmith
  roles: name, sex, age; fillers: Pat, male, 66

see_agent ∗ John + see_object ∗ "Fido bit Pat" = "John saw Fido bit Pat"

Roles are always atomic objects, but fillers may be complex statements.

Binding and chunking: ∗ denotes binding, + denotes superposition (also called chunking).

Distributed representations: Decoding

Decoding: apply x^{-1}, the (approximate/pseudo-) inverse of x.

Examples
- What is the name of PSmith?
  (name ∗ Pat + sex ∗ male + age ∗ 66) ∗ name^{-1} = Pat + noise ≈ Pat
- What did Fido do?
  (bite_agent ∗ Fido + bite_object ∗ Pat) ∗ Fido^{-1} = bite_agent + noise ≈ bite_agent

Clean-up memory
An auto-associative collection of all elements and statements (excluding single chunks) produced by the system. Given a noisy extracted vector, such a structure must be able either to recall the most similar stored item, or to indicate that no matching object has been found. We need a measure of similarity (in most models: scalar product, Hamming/Euclidean distance).

Distributed representations: Requirements/Problems
- Binding (∗) and superposition (+) should be symmetric, preserving an equal portion of every component.
- Binding and the inverse should be easy/fast to compute.
- Rules of composition and decomposition must be applicable to all elements of the domain.
- Fixed size of vectors, irrespective of the degree of complication.

Advantages
- Noise tolerance, good error-correction properties.
- Storage efficiency.
- The number of superimposed patterns is not fixed.

Previous architectures

Binary Spatter Code (Pentti Kanerva, 1997)
- Data: n-bit binary vectors of great length (n ≈ 10000).
- Both binding and unbinding are performed by XOR.
- Superposition (+): majority-rule addition (with additional random vectors to break ties).
- Similarity measure: Hamming distance.

Holographic Reduced Representation (Tony Plate, 1994, 2003)
- Data: normalized n-dimensional real vectors with elements drawn from N(0, 1/n).
- Binding (∗) performed by circular convolution.
- Superposition (+): normalized addition, e.g. (bite + bite_agent ∗ Fido + bite_object ∗ Pat)/√3.
- Inverse: involution x*_i = x_{(-i) mod n}.
- Unbinding: circular correlation (i.e. circular convolution with an involution), e.g.
  ((bite + bite_agent ∗ Fido + bite_object ∗ Pat)/√3) ∗ bite_agent*.
- Similarity measure: dot product.

(Both schemes are sketched in code below.)
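To make the two earlier architectures concrete, here is a minimal numpy sketch of their binding, bundling, and unbinding operations. It is not the talk's CartanGA program; the helper names (bind_bsc, bundle_bsc, bind_hrr, involution) and the vector sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Binary Spatter Code: long binary vectors, XOR binding, majority bundling ---
n = 10_000

def bind_bsc(a, b):
    # binding and unbinding are both bitwise XOR
    return np.bitwise_xor(a, b)

def bundle_bsc(*vs):
    # majority-rule addition; ties broken by random bits
    s = np.sum(vs, axis=0)
    out = (2 * s > len(vs)).astype(np.uint8)
    ties = (2 * s == len(vs))
    out[ties] = rng.integers(0, 2, ties.sum(), dtype=np.uint8)
    return out

role, filler = rng.integers(0, 2, (2, n), dtype=np.uint8)
# XOR unbinds an isolated chunk exactly:
assert np.count_nonzero(bind_bsc(bind_bsc(role, filler), role) != filler) == 0

other = rng.integers(0, 2, (2, n), dtype=np.uint8)
memory = bundle_bsc(bind_bsc(role, filler), other[0], other[1])
# unbinding the bundle with `role` gives a vector much closer to `filler` than chance
print(np.count_nonzero(bind_bsc(memory, role) != filler) / n)   # well below 0.5

# --- HRR: real vectors with entries ~ N(0, 1/m), circular-convolution binding ---
m = 1024

def rand_hrr():
    return rng.normal(0.0, 1.0 / np.sqrt(m), m)

def bind_hrr(a, b):
    # circular convolution, computed via FFT
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(a):
    # a*_i = a_{(-i) mod m}; convolving with it performs circular correlation
    return np.concatenate(([a[0]], a[1:][::-1]))

bite_agent, fido = rand_hrr(), rand_hrr()
noisy_fido = bind_hrr(bind_hrr(bite_agent, fido), involution(bite_agent))
print(np.dot(noisy_fido, fido))   # dot-product similarity, well above chance
```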

Binary parametrization of the geometric product

Binary parametrization
Let $(x_1, \dots, x_n)$ be a string of bits and $e_1, \dots, e_n$ be base vectors. Then
$$e_{x_1 \dots x_n} = e_1^{x_1} \cdots e_n^{x_n}, \qquad \text{where } e_k^0 = 1.$$
Example: $1 = e_{0\dots 0}$, $e_1 = e_{10\dots 0}$, $e_{235} = e_{011010\dots 0}$.

Geometric product as a projective XOR representation
Example:
$$e_{12}\, e_1 = e_{110\dots 0}\, e_{10\dots 0} = e_1 e_2 e_1 = -e_2 e_1 e_1 = -e_2 = -e_{010\dots 0} = -e_{(110\dots 0)\oplus(10\dots 0)} = (-1)^D e_{(110\dots 0)\oplus(10\dots 0)}.$$
More formally, for two arbitrary strings of bits:
$$e_{A_1 \dots A_n}\, e_{B_1 \dots B_n} = (-1)^{\sum_{k<l} B_k A_l}\, e_{(A_1 \dots A_n)\oplus(B_1 \dots B_n)}.$$
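The signed-XOR rule above is easy to put in code. A small sketch (the function name blade_product is mine) that reproduces the e_{12} e_1 = -e_2 example:

```python
import numpy as np

def blade_product(A, B):
    """Product of basis blades given as 0/1 bit strings: returns (sign, A xor B)."""
    A, B = np.asarray(A), np.asarray(B)
    # sign = (-1)^{sum_{k<l} B_k A_l}: each bit of B jumps over the later bits of A
    swaps = sum(int(B[k]) * int(A[k + 1:].sum()) for k in range(len(A)))
    return (-1) ** swaps, np.bitwise_xor(A, B)

# e_12 * e_1 = -e_2, i.e. sign -1 and bit string 01000 (here n = 5):
sign, bits = blade_product([1, 1, 0, 0, 0], [1, 0, 0, 0, 0])
print(sign, bits)   # -1 [0 1 0 0 0]
```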

Matrix representation

Pauli matrices:
$$\sigma_1 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad \sigma_2 = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \qquad \sigma_3 = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.$$

Cartan representation:
$$b_{2k} = \underbrace{\sigma_1 \otimes \cdots \otimes \sigma_1}_{n-k} \otimes\, \sigma_2 \otimes \underbrace{1 \otimes \cdots \otimes 1}_{k-1}, \qquad
b_{2k-1} = \underbrace{\sigma_1 \otimes \cdots \otimes \sigma_1}_{n-k} \otimes\, \sigma_3 \otimes \underbrace{1 \otimes \cdots \otimes 1}_{k-1}.$$
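A sketch of the Cartan generators as Kronecker products of Pauli matrices, together with a check of the Clifford relations b_i b_j + b_j b_i = 2 δ_ij. The helper names are mine, not CartanGA's.

```python
import numpy as np
from functools import reduce

s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def b(i, n):
    """Cartan generator b_i as an n-fold Kronecker product (2^n x 2^n matrix)."""
    k = (i + 1) // 2                      # b_{2k} and b_{2k-1} share the same k
    middle = s2 if i % 2 == 0 else s3
    return reduce(np.kron, [s1] * (n - k) + [middle] + [I2] * (k - 1))

n = 5
# Clifford relations: b_i b_j + b_j b_i = 2 delta_ij * identity
for i in range(1, n + 1):
    for j in range(1, n + 1):
        anti = b(i, n) @ b(j, n) + b(j, n) @ b(i, n)
        expected = 2 * np.eye(2 ** n) if i == j else np.zeros((2 ** n, 2 ** n))
        assert np.allclose(anti, expected)
print(f"b_1 ... b_{n} satisfy the Clifford relations")
```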

Example: Representation

PSmith = name ∗ Pat + sex ∗ male + age ∗ 66, with $n = 5$:

$$\begin{aligned}
\text{Pat}  &= c_{00100} = b_3 = \sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_3 \otimes 1 \\
\text{male} &= c_{00111} = b_3 b_4 b_5 = (\sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_3 \otimes 1)(\sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_2 \otimes 1)(\sigma_1 \otimes \sigma_1 \otimes \sigma_3 \otimes 1 \otimes 1) = \sigma_1 \otimes \sigma_1 \otimes \sigma_3 \otimes (-i\sigma_1) \otimes 1 \\
66          &= c_{11000} = b_1 b_2 = (\sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_3)(\sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_2) = 1 \otimes 1 \otimes 1 \otimes 1 \otimes (-i\sigma_1) \\
\text{name} &= c_{00010} = b_4 = \sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_2 \otimes 1 \\
\text{sex}  &= c_{11100} = b_1 b_2 b_3 = (\sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_3)(\sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_2)(\sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_3 \otimes 1) = \sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_3 \otimes (-i\sigma_1) \\
\text{age}  &= c_{10001} = b_1 b_5 = (\sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_1 \otimes \sigma_3)(\sigma_1 \otimes \sigma_1 \otimes \sigma_3 \otimes 1 \otimes 1) = 1 \otimes 1 \otimes (-i\sigma_2) \otimes \sigma_1 \otimes \sigma_3
\end{aligned}$$

Example: Encoding and decoding PSmith

$$\begin{aligned}
\text{PSmith} &= \text{name} \ast \text{Pat} + \text{sex} \ast \text{male} + \text{age} \ast 66 \\
              &= c_{00010}\, c_{00100} + c_{11100}\, c_{00111} + c_{10001}\, c_{11000} \\
              &= c_{00110} + c_{11011} + c_{01001}
\end{aligned}$$

PSmith's name = PSmith ∗ name^{-1} = PSmith ∗ name

$$\begin{aligned}
\text{PSmith} \ast \text{name} &= (c_{00110} + c_{11011} + c_{01001})\, c_{00010} \\
                               &= c_{00100} + c_{11001} + c_{01011} \\
                               &= \text{Pat} + \text{noise} \approx \text{Pat}
\end{aligned}$$

Example: Clean-up

Scalar product: the scalar (inner) product is computed by means of the matrix trace. For the decoded vector $\widetilde{\text{Pat}} = c_{00100} + c_{11001} + c_{01011} \approx \text{Pat}$, compared against the clean-up memory:

$$\langle \widetilde{\text{Pat}}, \text{Pat} \rangle = \mathrm{Tr}\big((c_{00100} + c_{11001} + c_{01011})\, c_{00100}\big) = \mathrm{Tr}\big(1 + c_{11101} + c_{01111}\big) = 32 \neq 0$$

$$\langle \widetilde{\text{Pat}}, \text{male} \rangle = \langle \widetilde{\text{Pat}}, 66 \rangle = \langle \widetilde{\text{Pat}}, \text{name} \rangle = \langle \widetilde{\text{Pat}}, \text{sex} \rangle = \langle \widetilde{\text{Pat}}, \text{age} \rangle = \langle \widetilde{\text{Pat}}, \text{PSmith} \rangle = 0$$
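The whole PSmith example can be reproduced in a few lines of numpy. This is my own sketch (the helper names are illustrative, and the signs of individual blades may differ from the talk's convention, which does not affect which traces vanish): atomic objects are products of the Cartan generators, binding is the matrix (geometric) product, chunking is matrix addition, and clean-up compares matrix-trace overlaps against the stored vocabulary.

```python
import numpy as np
from functools import reduce

s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
n = 5

def b(i):
    k = (i + 1) // 2
    middle = s2 if i % 2 == 0 else s3
    return reduce(np.kron, [s1] * (n - k) + [middle] + [I2] * (k - 1))

def c(bits):
    # c_{x_1...x_n} = b_1^{x_1} ... b_n^{x_n}
    out = np.eye(2 ** n, dtype=complex)
    for i, x in enumerate(bits, start=1):
        if x == "1":
            out = out @ b(i)
    return out

Pat, male, num66 = c("00100"), c("00111"), c("11000")
name, sex, age   = c("00010"), c("11100"), c("10001")

PSmith = name @ Pat + sex @ male + age @ num66   # encoding
decoded = PSmith @ name                          # name^{-1} = name, since c_{00010}^2 = 1

# Clean-up: compare the decoded vector with the stored atomic objects via the matrix trace.
vocabulary = {"Pat": Pat, "male": male, "66": num66,
              "name": name, "sex": sex, "age": age}
for label, item in vocabulary.items():
    overlap = np.trace(decoded @ item).real
    print(f"{label:>4}: {overlap:+.1f}")         # only Pat gives +/-32, the rest give 0
```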

Signatures: Examples

[Matrix plots of PSmith for n = 5, n = 6, and n = 7, showing the LHS-signature and RHS-signature block patterns.]

Signatures: Dimensions

There are $2^{\lfloor n/2 \rfloor}$ signatures on one of the (block) diagonals; each signature is of dimensions $2^{\lceil n/2 \rceil} \times 2^{\lceil n/2 \rceil}$.

Cartan representation:
$$b_{2k} = \underbrace{\sigma_1 \otimes \cdots \otimes \sigma_1}_{\text{at least } n/2} \otimes\, \sigma_2 \otimes \underbrace{1 \otimes \cdots \otimes 1}_{k-1}, \qquad
b_{2k-1} = \underbrace{\sigma_1 \otimes \cdots \otimes \sigma_1}_{\text{at least } n/2} \otimes\, \sigma_3 \otimes \underbrace{1 \otimes \cdots \otimes 1}_{k-1}.$$

Recognition tests: Construction

- Plate (HRR) construction, e.g. eat + eat_agent ∗ Mark + eat_object ∗ the_fish
- Agent-Object construction, e.g. eat_agent ∗ Mark + eat_object ∗ the_fish

Vocabulary/sentence set
Similar sentences, e.g.:
  Fido bit Pat. / Fido bit PSmith.
  Pat fled from Fido. / PSmith fled from Fido.
  Fido bit PSmith causing PSmith to flee from Fido. / Fido bit Pat causing Pat to flee from Fido.
  John saw that Fido bit PSmith causing PSmith to flee from Fido. / John saw that Fido bit Pat causing Pat to flee from Fido.
  etc.
Altogether 42 atomic objects and 19 sentences.

Recognition tests: Agent-Object construction

Works better for:
- simple sentences and simple answers,
- nested sentences from which a rather unique piece of information is to be derived.

[Plot: average % of correct answers vs. n (5 to 15), Plate construction vs. Agent-Object construction. Question: PSmith ∗ name; answer: Pat; 1000 trials.]

[Plot: average % of correct answers vs. n (5 to 15), Plate construction vs. Agent-Object construction. Question: (John saw that Fido bit Pat causing Pat to flee from Fido) ∗ see_agent; answer: John; 1000 trials.]

Recognition tests: Plate (HRR) construction

Works better for nested sentences from which a complex piece of information needs to be derived.

[Plot: average % of correct answers vs. n (5 to 15), Plate construction vs. Agent-Object construction. Question: (John saw that Fido bit Pat causing Pat to flee from Fido) ∗ see_object; answer: "Fido bit Pat causing Pat to flee from Fido"; 1000 trials.]

Blade linearity

Problems
- How often does the system produce identical blades representing atomic objects?
- How often does the system produce identical sentence chunks from different blades?
- How do the two problems above affect the number of (pseudo-)correct answers?

Assumption: ideal conditions
- No two chunks of a sentence are identical (up to a constant) at any time.
- $\langle A, C \rangle \neq 0 \iff$ A and C share a common blade (see the sketch below).
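A quick numeric illustration of that assumption (my own sketch, reusing the same blade construction as above; names and the choice of bit strings are illustrative): the trace-based overlap of two sums of blades is nonzero exactly when the sums share a blade, since all cross terms are traceless.

```python
import numpy as np
from functools import reduce

s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
n = 5

def blade(bits):
    def b(i):
        k = (i + 1) // 2
        mid = s2 if i % 2 == 0 else s3
        return reduce(np.kron, [s1] * (n - k) + [mid] + [I2] * (k - 1))
    out = np.eye(2 ** n, dtype=complex)
    for i, x in enumerate(bits, start=1):
        if x == "1":
            out = out @ b(i)
    return out

A = blade("00110") + blade("11011")           # two distinct chunks
C_shared   = blade("11011") + blade("01001")  # shares the blade 11011 with A
C_disjoint = blade("10101") + blade("01001")  # shares no blade with A

print(abs(np.trace(A @ C_shared)))    # nonzero: the matching blade contributes +/-2^n
print(abs(np.trace(A @ C_disjoint)))  # zero: every cross term is a traceless blade
```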

Blade linearity: one meaningful blade, L > 0 noisy blades

[Plot: average number of probings giving a nonzero matrix trace vs. n (5 to 15), test results vs. estimates. Question: (Fido bit Pat) ∗ bite_object; answer: Pat; 1000 trials; meaningful/noisy blades: 1/1.]

Blade linearity: K > 1 meaningful blades, L > 0 noisy blades

[Plot: average number of probings giving a nonzero matrix trace vs. n (5 to 15), test results vs. estimates. Question: (Fido bit PSmith) ∗ bite_object; answer: PSmith = name ∗ Pat + sex ∗ male + age ∗ 66; similar answer: DogFido = name ∗ Fido + sex ∗ male + ...; 1000 trials; meaningful/noisy blades: 3/1.]

What's next? Issues
- Decide on a construction (something between Plate and Agent-Object?).
- Tests for greater n.
- Scaling.

References

M. Czachor, D. Aerts and B. De Moor. On geometric-algebra representation of binary spatter codes. Preprint arXiv:cs/0610075 [cs.AI], 2006.
M. Czachor, D. Aerts and B. De Moor. Geometric Analogue of Holographic Reduced Representation. Preprint arXiv:0710.2611, 2007.
G. E. Hinton, J. L. McClelland and D. E. Rumelhart. Distributed representations. In: Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1, pp. 77-109. The MIT Press, Cambridge, MA, 1986.
P. Kanerva. Binary spatter codes of ordered k-tuples. In: C. von der Malsburg et al. (eds.), Artificial Neural Networks (ICANN 96) Proceedings, Lecture Notes in Computer Science vol. 1112, pp. 869-873. Springer, Berlin, 1996.
P. Kanerva. Fully distributed representation. Proc. 1997 Real World Computing Symposium (RWC'97, Tokyo), pp. 358-365. Real World Computing Partnership, Tsukuba City, Japan, 1997.
T. Plate. Holographic Reduced Representation: Distributed Representation for Cognitive Structures. CSLI Publications, Stanford, 2003.