Geometric Analogue of Holographic Reduced Representations
Geometric Analogue of Holographic Reduced Representations

Agnieszka Patyk
Ph.D. Supervisor: prof. Marek Czachor
Faculty of Applied Physics and Mathematics, Gdańsk University of Technology, Poland
and Centrum Leo Apostel (CLEA), Vrije Universiteit Brussel, Belgium
Grimma, August 2008
Outline
- Background: distributed representations, previous architectures
- GA model: binary parametrization of the geometric product, Cartan representation
- Example
- Signatures
- Test results: recognition tests, blade linearity
- Future work: computer program CartanGA
Distributed representations

Example: green color + triangular shape = green triangle

Idea: an opposite of, and an alternative to, traditional data structures (lists, databases, etc.).

Hinton (1986):
- each concept is represented over a number of units (bytes)
- each unit participates in the representation of some number of concepts
- the size of a distributed representation is usually fixed
- the units have either binary or continuous-space values
- data patterns are chosen as random vectors or matrices
- in most distributed representations only the overall pattern of activated units has a meaning (resembling a multiply superimposed photograph)
Distributed representations: atomic objects and complex statements

bite_agent ⊛ Fido + bite_object ⊛ Pat = "Fido bit Pat"
(each bound pair is a chunk; roles: bite_agent, bite_object; fillers: Fido, Pat)

name ⊛ Pat + sex ⊛ male + age ⊛ 66 = PSmith
(roles: name, sex, age; fillers: Pat, male, 66)

see_agent ⊛ John + see_object ⊛ (Fido bit Pat) = "John saw Fido bit Pat"

Roles are always atomic objects, but fillers may be complex statements.

Binding and chunking: binding (⊛) + superposition (also called chunking).
Distributed representations: decoding

Decoding: multiply by x⁻¹, an (approximate/pseudo-) inverse.

Examples:
- What is the name of PSmith?
  (name ⊛ Pat + sex ⊛ male + age ⊛ 66) ⊛ name⁻¹ = Pat + noise ≈ Pat
- What did Fido do?
  (bite_agent ⊛ Fido + bite_object ⊛ Pat) ⊛ Fido⁻¹ = bite_agent + noise ≈ bite_agent

Clean-up memory: an auto-associative collection of the elements and statements (excluding single chunks) produced by the system. Given a noisy extracted vector, such a structure must be able either to recall the most similar stored item or to indicate that no matching object has been found. We need a measure of similarity (in most models: scalar product, Hamming/Euclidean distance).
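The clean-up step can be sketched as a nearest-neighbour search over the stored items. A minimal illustration in Python/NumPy (the vectors, dimension and threshold here are invented for the example, not taken from the slides):

```python
import numpy as np

def cleanup(noisy, memory, threshold=0.2):
    """Return the name of the stored item most similar to `noisy`
    (dot-product similarity), or None if nothing matches well enough."""
    best_name, best_sim = None, threshold
    for name, vec in memory.items():
        sim = float(np.dot(noisy, vec))
        if sim > best_sim:
            best_name, best_sim = name, sim
    return best_name

rng = np.random.default_rng(0)
n = 1000
# Atomic objects are random vectors, as in most distributed models.
memory = {name: rng.normal(0, 1 / np.sqrt(n), n)
          for name in ["Pat", "Fido", "male", "66"]}

# A noisy version of Pat is still recognized ...
recalled = cleanup(memory["Pat"] + 0.5 * rng.normal(0, 1 / np.sqrt(n), n), memory)
# ... while an unrelated random vector matches nothing.
unrelated = cleanup(rng.normal(0, 1 / np.sqrt(n), n), memory)
print(recalled, unrelated)
```

With vectors drawn from N(0, 1/n) the self-similarity concentrates near 1 while cross-similarities stay near 0, which is what makes a fixed threshold workable.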
Distributed representations: requirements and problems

- Binding (⊛) and superposition (+) should be symmetric, preserving an equal portion of every component.
- Binding and the inverse should be easy/fast to compute.
- Rules of composition and decomposition must be applicable to all elements of the domain.
- Vectors have a fixed size, irrespective of the degree of complication.

Advantages:
- noise tolerance, good error-correction properties
- storage efficiency
- the number of superimposed patterns is not fixed
Previous architectures

Binary Spatter Code (Pentti Kanerva, 1997)
- Data: n-bit binary vectors of great length (n ≈ 10000).
- Both binding and unbinding are performed by XOR.
- Superposition (+): majority-rule addition (with additional random vectors to break ties).
- Similarity measure: Hamming distance.

Holographic Reduced Representation (Tony Plate, 1994, 2003)
- Data: normalized n-element real vectors with components chosen from N(0, 1/n).
- Binding is performed by circular convolution.
- Superposition (+): normalized addition, e.g. (bite + bite_agent ⊛ Fido + bite_object ⊛ Pat)/√3.
- Approximate inverse: the involution x*_i = x_{(-i) mod n}.
- Unbinding: circular correlation (i.e. circular convolution with an involution), e.g. ((bite + bite_agent ⊛ Fido + bite_object ⊛ Pat)/√3) ⊛ bite_agent*.
- Similarity measure: dot product.
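The two binding schemes can be sketched in a few lines of NumPy. This is an illustration with made-up parameters, not code from the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

# Binary Spatter Code: binding and unbinding are both bitwise XOR.
n = 10000
role, filler = rng.integers(0, 2, n), rng.integers(0, 2, n)
bound = role ^ filler
decoded_bsc = bound ^ role          # XOR is its own inverse: exact recovery
assert np.array_equal(decoded_bsc, filler)

# HRR: binding by circular convolution, unbinding by circular correlation.
def cconv(a, b):
    """Circular convolution, computed via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(x):
    """x*_i = x_{(-i) mod n}; correlation = convolution with x*."""
    return np.roll(x[::-1], 1)

m = 1024
a = rng.normal(0, 1 / np.sqrt(m), m)   # role
b = rng.normal(0, 1 / np.sqrt(m), m)   # filler
trace = cconv(a, b)
decoded_hrr = cconv(trace, involution(a))
# decoded_hrr is only a noisy copy of b: high dot product with b, low with a,
# so it must still be passed through clean-up memory.
sim_b = float(np.dot(decoded_hrr, b))
sim_a = float(np.dot(decoded_hrr, a))
```

Note the contrast the slide draws: BSC unbinding is exact (XOR cancels), while HRR unbinding is approximate and relies on the dot-product clean-up.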
Binary parametrization of the geometric product

Let (x_1, ..., x_n) be a string of bits and {e_1, ..., e_n} be basis vectors. Then

  e_{x_1...x_n} = e_1^{x_1} ... e_n^{x_n},   where e_k^0 = 1.

Example (n = 5): 1 = e_{00000}, e_1 = e_{10000}, e_{235} = e_2 e_3 e_5 = e_{01101}.

Geometric product as a projective XOR representation. Example:

  e_{12} e_1 = e_{1100...0} e_{1000...0} = e_1 e_2 e_1 = -e_2 e_1 e_1 = -e_2 = -e_{0100...0} = -e_{(1100...0) ⊕ (1000...0)}

More formally, for two arbitrary strings of bits:

  e_{A_1...A_n} e_{B_1...B_n} = (-1)^{Σ_{k<l} B_k A_l} e_{(A_1...A_n) ⊕ (B_1...B_n)}
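The sign rule can be checked mechanically on the bit strings themselves. A small sketch (the function name is mine):

```python
def blade_product(A, B):
    """Geometric product of basis blades given as bit tuples:
    e_A e_B = (-1)^(sum over k<l of B_k * A_l) e_(A xor B)."""
    n = len(A)
    swaps = sum(B[k] * A[l] for k in range(n) for l in range(k + 1, n))
    sign = -1 if swaps % 2 else 1
    bits = tuple(a ^ b for a, b in zip(A, B))
    return sign, bits

# The slide's example, e_12 e_1 = -e_2, here with n = 4:
print(blade_product((1, 1, 0, 0), (1, 0, 0, 0)))   # (-1, (0, 1, 0, 0))
# A basis vector squares to +1: e_1 e_1 = 1.
print(blade_product((1, 0, 0, 0), (1, 0, 0, 0)))   # (1, (0, 0, 0, 0))
```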
Matrix representation

Pauli matrices:

  σ_1 = [[0, 1], [1, 0]],  σ_2 = [[0, -i], [i, 0]],  σ_3 = [[1, 0], [0, -1]].

Cartan representation (n tensor factors, k = 1, ..., n):

  b_{2k}   = σ_1 ⊗ ... ⊗ σ_1 ⊗ σ_2 ⊗ 1 ⊗ ... ⊗ 1   (n-k copies of σ_1, then σ_2, then k-1 copies of 1)
  b_{2k-1} = σ_1 ⊗ ... ⊗ σ_1 ⊗ σ_3 ⊗ 1 ⊗ ... ⊗ 1   (n-k copies of σ_1, then σ_3, then k-1 copies of 1)
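One way to sanity-check the Cartan representation is to build the b_m as Kronecker products and verify the Clifford relations {b_i, b_j} = 2 δ_ij 1. A sketch in NumPy (slot counts as defined above; the helper names are mine):

```python
import numpy as np

s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]])
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
id2 = np.eye(2)

def kron_all(factors):
    out = np.eye(1)
    for f in factors:
        out = np.kron(out, f)
    return out

def b(m, n):
    """Cartan matrix b_m, m = 1..2n: (n-k) copies of sigma_1, then
    sigma_2 (m even) or sigma_3 (m odd), then (k-1) identities, k = ceil(m/2)."""
    k = (m + 1) // 2
    middle = s2 if m % 2 == 0 else s3
    return kron_all([s1] * (n - k) + [middle] + [id2] * (k - 1))

# Verify {b_i, b_j} = 2 delta_ij for n = 3 (8x8 matrices).
n, dim = 3, 8
clifford_ok = all(
    np.allclose(b(i, n) @ b(j, n) + b(j, n) @ b(i, n),
                2 * np.eye(dim) if i == j else np.zeros((dim, dim)))
    for i in range(1, 2 * n + 1) for j in range(1, 2 * n + 1))
print(clifford_ok)   # True
```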
Example: representation

PSmith = name ⊛ Pat + sex ⊛ male + age ⊛ 66   (n = 5)

Pat  = c_{00100} = b_3 = σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_3 ⊗ 1
male = c_{00111} = b_3 b_4 b_5 = (σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_3 ⊗ 1)(σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_2 ⊗ 1)(σ_1 ⊗ σ_1 ⊗ σ_3 ⊗ 1 ⊗ 1) = σ_1 ⊗ σ_1 ⊗ σ_3 ⊗ (-iσ_1) ⊗ 1
66   = c_{11000} = b_1 b_2 = (σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_3)(σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_2) = 1 ⊗ 1 ⊗ 1 ⊗ 1 ⊗ (-iσ_1)
name = c_{00010} = b_4 = σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_2 ⊗ 1
sex  = c_{11100} = b_1 b_2 b_3 = (σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_3)(σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_2)(σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_3 ⊗ 1) = σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_3 ⊗ (-iσ_1)
age  = c_{10001} = b_1 b_5 = (σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_1 ⊗ σ_3)(σ_1 ⊗ σ_1 ⊗ σ_3 ⊗ 1 ⊗ 1) = 1 ⊗ 1 ⊗ (-iσ_2) ⊗ σ_1 ⊗ σ_3
Example: encoding and decoding PSmith

PSmith = name ⊛ Pat + sex ⊛ male + age ⊛ 66
       = c_{00010} c_{00100} + c_{11100} c_{00111} + c_{10001} c_{11000}
       = ±c_{00110} ± c_{11011} ± c_{01001}   (signs fixed by the geometric product)

PSmith's name = PSmith ⊛ name⁻¹ = PSmith ⊛ name

PSmith ⊛ name = (±c_{00110} ± c_{11011} ± c_{01001}) c_{00010}
              = ±c_{00100} ± c_{11001} ± c_{01011}
              = ±Pat + noise ≈ Pat
Example: clean-up

The scalar (inner) product is performed by means of the matrix trace. For the noisy vector extracted above,

  ⟨PSmith ⊛ name, Pat⟩ = Tr((±c_{00100} ± c_{11001} ± c_{01011}) c_{00100}) = Tr(±1 ± c_{11101} ± c_{01111}) = ±32 ≠ 0,

since only the identity blade has a nonzero trace, while

  ⟨PSmith ⊛ name, male⟩ = ⟨PSmith ⊛ name, 66⟩ = ⟨PSmith ⊛ name, name⟩ = ⟨PSmith ⊛ name, sex⟩ = ⟨PSmith ⊛ name, age⟩ = ⟨PSmith ⊛ name, PSmith⟩ = 0.
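Under the blade assignments from the example slides, the whole encode/decode/clean-up cycle can be reproduced numerically. A sketch in NumPy (n = 5, so the matrices are 32 x 32; the trace similarity is taken in absolute value here because decoding may flip a blade's sign):

```python
import numpy as np

s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]])
s3 = np.array([[1, 0], [0, -1]], dtype=complex)
id2 = np.eye(2)

def kron_all(factors):
    out = np.eye(1)
    for f in factors:
        out = np.kron(out, f)
    return out

def b(m, n=5):
    """Cartan matrix b_m: (n-k) sigma_1 factors, then sigma_2 (m even)
    or sigma_3 (m odd), then (k-1) identities, where k = ceil(m/2)."""
    k = (m + 1) // 2
    middle = s2 if m % 2 == 0 else s3
    return kron_all([s1] * (n - k) + [middle] + [id2] * (k - 1))

# Atomic objects exactly as on the example slides.
Pat, male, num66 = b(3), b(3) @ b(4) @ b(5), b(1) @ b(2)
name, sex, age = b(4), b(1) @ b(2) @ b(3), b(1) @ b(5)
atoms = {"Pat": Pat, "male": male, "66": num66,
         "name": name, "sex": sex, "age": age}

PSmith = name @ Pat + sex @ male + age @ num66

# Decode the name: name is its own inverse (b_4 squares to 1).
noisy = PSmith @ name

def sim(x, y):
    """Clean-up similarity: absolute value of the matrix trace of x y."""
    return abs(np.trace(x @ y))

best = max(atoms, key=lambda key: sim(noisy, atoms[key]))
print(best, sim(noisy, Pat))   # Pat 32.0
```

Only the Pat comparison yields the identity blade under the product, so its trace is ±32 while every other similarity vanishes exactly, matching the slide.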
Signatures: examples

[Figure: matrix plots of PSmith for n = 5, n = 7, and n = 6, showing the LHS-signature and RHS-signature.]
Signatures: dimensions

There are 2^{n/2} signatures on one of the diagonals; each signature is of dimensions 2^{n/2} x 2^{n/2}.

Cartan representation:

  b_{2k}   = σ_1 ⊗ ... ⊗ σ_1 ⊗ σ_2 ⊗ 1 ⊗ ... ⊗ 1   (at least n/2 copies of σ_1, then σ_2, then k-1 copies of 1)
  b_{2k-1} = σ_1 ⊗ ... ⊗ σ_1 ⊗ σ_3 ⊗ 1 ⊗ ... ⊗ 1   (at least n/2 copies of σ_1, then σ_3, then k-1 copies of 1)
Recognition tests: construction

- Plate (HRR) construction, e.g. eat + eat_agent ⊛ Mark + eat_object ⊛ thefish
- Agent-Object construction, e.g. eat_agent ⊛ Mark + eat_object ⊛ thefish

Vocabulary/sentence set: similar sentences, e.g.:
- Fido bit Pat. / Fido bit PSmith.
- Pat fled from Fido. / PSmith fled from Fido.
- Fido bit PSmith causing PSmith to flee from Fido. / Fido bit Pat causing Pat to flee from Fido.
- John saw that Fido bit PSmith causing PSmith to flee from Fido. / John saw that Fido bit Pat causing Pat to flee from Fido.
- etc.

Altogether 42 atomic objects and 19 sentences.
Recognition tests: Agent-Object construction

Works better for:
- simple sentences and simple answers
  [Plot: avg % of correct answers vs. n, Plate vs. Agent-Object construction; question: PSmith ⊛ name, answer: Pat]
- nested sentences from which rather unique information is to be derived
  [Plot: avg % of correct answers vs. n; question: (John saw that Fido bit Pat causing Pat to flee from Fido) ⊛ see_agent, answer: John; 1000 trials]

Recognition tests: Plate (HRR) construction

Works better for nested sentences from which complex information needs to be derived.
  [Plot: avg % of correct answers vs. n; question: (John saw that Fido bit Pat causing Pat to flee from Fido) ⊛ see_object, answer: Fido bit Pat causing Pat to flee from Fido; 1000 trials]
Blade linearity

Problems:
- How often does the system produce identical blades representing atomic objects?
- How often does the system produce identical sentence chunks from different blades?
- How do the two problems above affect the number of (pseudo-)correct answers?

Assumption (ideal conditions): no two chunks of a sentence are identical (up to a constant) at any time.

  ⟨A, C⟩ ≠ 0  ⟺  A and C share a common blade
Blade linearity: one meaningful blade, L > 0 noisy blades

[Plot: avg number of probings giving a nonzero matrix trace vs. n; test results vs. estimates; question: (Fido bit Pat) ⊛ bite_object, answer: Pat; 1 meaningful / L noisy blades]
Blade linearity: K > 1 meaningful blades, L > 0 noisy blades

[Plot: avg number of probings giving a nonzero matrix trace vs. n; test results vs. estimates; question: (Fido bit PSmith) ⊛ bite_object, answer: PSmith = name ⊛ Pat + sex ⊛ male + age ⊛ 66, similar answer: DogFido = name ⊛ Fido + sex ⊛ male; 1000 trials; 3 meaningful / L noisy blades]
What's next?

Issues:
- Decide on a construction (something between Plate and Agent-Object?).
- Tests for greater n.
- Scaling.
References

- M. Czachor, D. Aerts, B. De Moor. On geometric-algebra representation of binary spatter codes. arXiv preprint (cs.AI).
- M. Czachor, D. Aerts, B. De Moor. Geometric analogue of holographic reduced representation. arXiv preprint.
- G. E. Hinton, J. L. McClelland, D. E. Rumelhart. Distributed representations. In: Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1. The MIT Press, Cambridge, MA.
- P. Kanerva. Binary spatter codes of ordered k-tuples. In: C. von der Malsburg et al. (eds.), Artificial Neural Networks (ICANN Proceedings), Lecture Notes in Computer Science, vol. 1112. Springer, Berlin.
- P. Kanerva. Fully distributed representation. In: Proc. Real World Computing Symposium (RWC'97, Tokyo). Real World Computing Partnership, Tsukuba City, Japan.
- T. Plate. Holographic Reduced Representation: Distributed Representation for Cognitive Structures. CSLI Publications, Stanford, 2003.
Outline Recall: For integers Euclidean algorithm for finding gcd s Extended Euclid for finding multiplicative inverses Extended Euclid for computing Sun-Ze Test for primitive roots And for polynomials
More informationCS Topics in Cryptography January 28, Lecture 5
CS 4501-6501 Topics in Cryptography January 28, 2015 Lecture 5 Lecturer: Mohammad Mahmoody Scribe: Ameer Mohammed 1 Learning with Errors: Motivation An important goal in cryptography is to find problems
More informationArrangements, matroids and codes
Arrangements, matroids and codes first lecture Ruud Pellikaan joint work with Relinde Jurrius ACAGM summer school Leuven Belgium, 18 July 2011 References 2/43 1. Codes, arrangements and matroids by Relinde
More informationarxiv: v1 [cs.ai] 10 Jun 2010
The Pet-Fish problem on the World-Wide Web Diederik Aerts Center Leo Apostel for Interdisciplinary Studies Vrije Universiteit Brussel, 1160 Brussels, Belgium Email: diraerts@vub.ac.be arxiv:1006.1930v1
More informationCoset Decomposition Method for Decoding Linear Codes
International Journal of Algebra, Vol. 5, 2011, no. 28, 1395-1404 Coset Decomposition Method for Decoding Linear Codes Mohamed Sayed Faculty of Computer Studies Arab Open University P.O. Box: 830 Ardeya
More informationMath (P)Review Part II:
Math (P)Review Part II: Vector Calculus Computer Graphics Assignment 0.5 (Out today!) Same story as last homework; second part on vector calculus. Slightly fewer questions Last Time: Linear Algebra Touched
More informationUsing a Hopfield Network: A Nuts and Bolts Approach
Using a Hopfield Network: A Nuts and Bolts Approach November 4, 2013 Gershon Wolfe, Ph.D. Hopfield Model as Applied to Classification Hopfield network Training the network Updating nodes Sequencing of
More informationAlgebraic Codes for Error Control
little -at- mathcs -dot- holycross -dot- edu Department of Mathematics and Computer Science College of the Holy Cross SACNAS National Conference An Abstract Look at Algebra October 16, 2009 Outline Coding
More informationMATRICES. a m,1 a m,n A =
MATRICES Matrices are rectangular arrays of real or complex numbers With them, we define arithmetic operations that are generalizations of those for real and complex numbers The general form a matrix of
More informationPositive semidefinite matrix approximation with a trace constraint
Positive semidefinite matrix approximation with a trace constraint Kouhei Harada August 8, 208 We propose an efficient algorithm to solve positive a semidefinite matrix approximation problem with a trace
More informationPersistent Homology. 128 VI Persistence
8 VI Persistence VI. Persistent Homology A main purpose of persistent homology is the measurement of the scale or resolution of a topological feature. There are two ingredients, one geometric, assigning
More informationLecture 3: Error Correcting Codes
CS 880: Pseudorandomness and Derandomization 1/30/2013 Lecture 3: Error Correcting Codes Instructors: Holger Dell and Dieter van Melkebeek Scribe: Xi Wu In this lecture we review some background on error
More informationVectors and Matrices Statistics with Vectors and Matrices
Vectors and Matrices Statistics with Vectors and Matrices Lecture 3 September 7, 005 Analysis Lecture #3-9/7/005 Slide 1 of 55 Today s Lecture Vectors and Matrices (Supplement A - augmented with SAS proc
More informationEXTENDED FUZZY COGNITIVE MAPS
EXTENDED FUZZY COGNITIVE MAPS Masafumi Hagiwara Department of Psychology Stanford University Stanford, CA 930 U.S.A. Abstract Fuzzy Cognitive Maps (FCMs) have been proposed to represent causal reasoning
More informationDot Products, Transposes, and Orthogonal Projections
Dot Products, Transposes, and Orthogonal Projections David Jekel November 13, 2015 Properties of Dot Products Recall that the dot product or standard inner product on R n is given by x y = x 1 y 1 + +
More informationNeural Turing Machine. Author: Alex Graves, Greg Wayne, Ivo Danihelka Presented By: Tinghui Wang (Steve)
Neural Turing Machine Author: Alex Graves, Greg Wayne, Ivo Danihelka Presented By: Tinghui Wang (Steve) Introduction Neural Turning Machine: Couple a Neural Network with external memory resources The combined
More informationCSCI 252: Neural Networks and Graphical Models. Fall Term 2016 Prof. Levy. Architecture #7: The Simple Recurrent Network (Elman 1990)
CSCI 252: Neural Networks and Graphical Models Fall Term 2016 Prof. Levy Architecture #7: The Simple Recurrent Network (Elman 1990) Part I Multi-layer Neural Nets Taking Stock: What can we do with neural
More informationAn overview of key ideas
An overview of key ideas This is an overview of linear algebra given at the start of a course on the mathematics of engineering. Linear algebra progresses from vectors to matrices to subspaces. Vectors
More informationLecture 4: Linear Algebra 1
Lecture 4: Linear Algebra 1 Sourendu Gupta TIFR Graduate School Computational Physics 1 February 12, 2010 c : Sourendu Gupta (TIFR) Lecture 4: Linear Algebra 1 CP 1 1 / 26 Outline 1 Linear problems Motivation
More informationCSCI 2570 Introduction to Nanocomputing
CSCI 2570 Introduction to Nanocomputing Information Theory John E Savage What is Information Theory Introduced by Claude Shannon. See Wikipedia Two foci: a) data compression and b) reliable communication
More informationCS286.2 Lecture 8: A variant of QPCP for multiplayer entangled games
CS286.2 Lecture 8: A variant of QPCP for multiplayer entangled games Scribe: Zeyu Guo In the first lecture, we saw three equivalent variants of the classical PCP theorems in terms of CSP, proof checking,
More information2- AUTOASSOCIATIVE NET - The feedforward autoassociative net considered in this section is a special case of the heteroassociative net.
2- AUTOASSOCIATIVE NET - The feedforward autoassociative net considered in this section is a special case of the heteroassociative net. - For an autoassociative net, the training input and target output
More informationOutline. policies for the first part. with some potential answers... MCS 260 Lecture 10.0 Introduction to Computer Science Jan Verschelde, 9 July 2014
Outline 1 midterm exam on Friday 11 July 2014 policies for the first part 2 questions with some potential answers... MCS 260 Lecture 10.0 Introduction to Computer Science Jan Verschelde, 9 July 2014 Intro
More informationThe Logic of Geometric Proof
The Logic of Geometric Proof Ron Rood Department of Philosophy, Vrije Universiteit Amsterdam ron.rood@planet.nl Logical studies of diagrammatic reasoning indeed, mathematical reasoning in general are typically
More informationLinear Algebra V = T = ( 4 3 ).
Linear Algebra Vectors A column vector is a list of numbers stored vertically The dimension of a column vector is the number of values in the vector W is a -dimensional column vector and V is a 5-dimensional
More informationWolf-Tilo Balke Silviu Homoceanu Institut für Informationssysteme Technische Universität Braunschweig
Multimedia Databases Wolf-Tilo Balke Silviu Homoceanu Institut für Informationssysteme Technische Universität Braunschweig http://www.ifis.cs.tu-bs.de 14 Indexes for Multimedia Data 14 Indexes for Multimedia
More informationLearning and Memory in Neural Networks
Learning and Memory in Neural Networks Guy Billings, Neuroinformatics Doctoral Training Centre, The School of Informatics, The University of Edinburgh, UK. Neural networks consist of computational units
More informationDeterminants and Scalar Multiplication
Invertibility and Properties of Determinants In a previous section, we saw that the trace function, which calculates the sum of the diagonal entries of a square matrix, interacts nicely with the operations
More informationSome Introductory Notes on Quantum Computing
Some Introductory Notes on Quantum Computing Markus G. Kuhn http://www.cl.cam.ac.uk/~mgk25/ Computer Laboratory University of Cambridge 2000-04-07 1 Quantum Computing Notation Quantum Computing is best
More informationLogistic Regression. Will Monroe CS 109. Lecture Notes #22 August 14, 2017
1 Will Monroe CS 109 Logistic Regression Lecture Notes #22 August 14, 2017 Based on a chapter by Chris Piech Logistic regression is a classification algorithm1 that works by trying to learn a function
More informationI&C 6N. Computational Linear Algebra
I&C 6N Computational Linear Algebra 1 Lecture 1: Scalars and Vectors What is a scalar? Computer representation of a scalar Scalar Equality Scalar Operations Addition and Multiplication What is a vector?
More informationThe Ordinal Serial Encoding Model: Serial Memory in Spiking Neurons
The Ordinal Serial Encoding Model: Serial Memory in Spiking Neurons by Feng-Xuan Choo A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for the degree of Master
More informationLecture 1: Systems of linear equations and their solutions
Lecture 1: Systems of linear equations and their solutions Course overview Topics to be covered this semester: Systems of linear equations and Gaussian elimination: Solving linear equations and applications
More informationLinear Models Review
Linear Models Review Vectors in IR n will be written as ordered n-tuples which are understood to be column vectors, or n 1 matrices. A vector variable will be indicted with bold face, and the prime sign
More informationWed Feb The vector spaces 2, 3, n. Announcements: Warm-up Exercise:
Wed Feb 2 4-42 The vector spaces 2, 3, n Announcements: Warm-up Exercise: 4-42 The vector space m and its subspaces; concepts related to "linear combinations of vectors" Geometric interpretation of vectors
More informationDiscriminative Direction for Kernel Classifiers
Discriminative Direction for Kernel Classifiers Polina Golland Artificial Intelligence Lab Massachusetts Institute of Technology Cambridge, MA 02139 polina@ai.mit.edu Abstract In many scientific and engineering
More informationA Coupled Helmholtz Machine for PCA
A Coupled Helmholtz Machine for PCA Seungjin Choi Department of Computer Science Pohang University of Science and Technology San 3 Hyoja-dong, Nam-gu Pohang 79-784, Korea seungjin@postech.ac.kr August
More informationLarge Scale Data Analysis Using Deep Learning
Large Scale Data Analysis Using Deep Learning Linear Algebra U Kang Seoul National University U Kang 1 In This Lecture Overview of linear algebra (but, not a comprehensive survey) Focused on the subset
More informationWe use the overhead arrow to denote a column vector, i.e., a number with a direction. For example, in three-space, we write
1 MATH FACTS 11 Vectors 111 Definition We use the overhead arrow to denote a column vector, ie, a number with a direction For example, in three-space, we write The elements of a vector have a graphical
More information1111: Linear Algebra I
1111: Linear Algebra I Dr. Vladimir Dotsenko (Vlad) Lecture 1 Dr. Vladimir Dotsenko (Vlad) 1111: Linear Algebra I Lecture 1 1 / 12 House rules 3 lectures on all odd weeks, 2 lectures and one tutorial on
More informationJim Lambers MAT 610 Summer Session Lecture 2 Notes
Jim Lambers MAT 610 Summer Session 2009-10 Lecture 2 Notes These notes correspond to Sections 2.2-2.4 in the text. Vector Norms Given vectors x and y of length one, which are simply scalars x and y, the
More informationThe query register and working memory together form the accessible memory, denoted H A. Thus the state of the algorithm is described by a vector
1 Query model In the quantum query model we wish to compute some function f and we access the input through queries. The complexity of f is the number of queries needed to compute f on a worst-case input
More informationBlack Box Linear Algebra
Black Box Linear Algebra An Introduction to Wiedemann s Approach William J. Turner Department of Mathematics & Computer Science Wabash College Symbolic Computation Sometimes called Computer Algebra Symbols
More informationVector spaces. DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis.
Vector spaces DS-GA 1013 / MATH-GA 2824 Optimization-based Data Analysis http://www.cims.nyu.edu/~cfgranda/pages/obda_fall17/index.html Carlos Fernandez-Granda Vector space Consists of: A set V A scalar
More informationProblem Set # 1 Solution, 18.06
Problem Set # 1 Solution, 1.06 For grading: Each problem worths 10 points, and there is points of extra credit in problem. The total maximum is 100. 1. (10pts) In Lecture 1, Prof. Strang drew the cone
More informationEntropy in Classical and Quantum Information Theory
Entropy in Classical and Quantum Information Theory William Fedus Physics Department, University of California, San Diego. Entropy is a central concept in both classical and quantum information theory,
More informationData Analysis and Manifold Learning Lecture 9: Diffusion on Manifolds and on Graphs
Data Analysis and Manifold Learning Lecture 9: Diffusion on Manifolds and on Graphs Radu Horaud INRIA Grenoble Rhone-Alpes, France Radu.Horaud@inrialpes.fr http://perception.inrialpes.fr/ Outline of Lecture
More informationMemories Associated with Single Neurons and Proximity Matrices
Memories Associated with Single Neurons and Proximity Matrices Subhash Kak Oklahoma State University, Stillwater Abstract: This paper extends the treatment of single-neuron memories obtained by the use
More informationEuclidean Spaces. Euclidean Spaces. Chapter 10 -S&B
Chapter 10 -S&B The Real Line: every real number is represented by exactly one point on the line. The plane (i.e., consumption bundles): Pairs of numbers have a geometric representation Cartesian plane
More informationLinear Algebra for Machine Learning. Sargur N. Srihari
Linear Algebra for Machine Learning Sargur N. srihari@cedar.buffalo.edu 1 Overview Linear Algebra is based on continuous math rather than discrete math Computer scientists have little experience with it
More informationComputational Foundations of Cognitive Science. Definition of the Inverse. Definition of the Inverse. Definition Computation Properties
al Foundations of Cognitive Science Lecture 1: s; Frank Keller School of Informatics University of Edinburgh keller@inf.ed.ac.uk February, 010 1 Reading: Anton and Busby, Chs..,.6 Frank Keller al Foundations
More informationMa/CS 6b Class 24: Error Correcting Codes
Ma/CS 6b Class 24: Error Correcting Codes By Adam Sheffer Communicating Over a Noisy Channel Problem. We wish to transmit a message which is composed of 0 s and 1 s, but noise might accidentally flip some
More informationDefinition (T -invariant subspace) Example. Example
Eigenvalues, Eigenvectors, Similarity, and Diagonalization We now turn our attention to linear transformations of the form T : V V. To better understand the effect of T on the vector space V, we begin
More informationHints and Selected Answers to Homework Sets
Hints and Selected Answers to Homework Sets Phil R. Smith, Ph.D. March 8, 2 Test I: Systems of Linear Equations; Matrices Lesson. Here s a linear equation in terms of a, b, and c: a 5b + 7c 2 ( and here
More informationIterative Autoassociative Net: Bidirectional Associative Memory
POLYTECHNIC UNIVERSITY Department of Computer and Information Science Iterative Autoassociative Net: Bidirectional Associative Memory K. Ming Leung Abstract: Iterative associative neural networks are introduced.
More informationExtending the Associative Rule Chaining Architecture for Multiple Arity Rules
Extending the Associative Rule Chaining Architecture for Multiple Arity Rules Nathan Burles, James Austin, and Simon O Keefe Advanced Computer Architectures Group Department of Computer Science University
More informationAnomaly Detection. What is an anomaly?
Anomaly Detection Brian Palmer What is an anomaly? the normal behavior of a process is characterized by a model Deviations from the model are called anomalies. Example Applications versus spyware There
More informationComputational Foundations of Cognitive Science. Addition and Scalar Multiplication. Matrix Multiplication
Computational Foundations of Cognitive Science Lecture 1: Algebraic ; Transpose; Inner and Outer Product Frank Keller School of Informatics University of Edinburgh keller@inf.ed.ac.uk February 23, 21 1
More informationNumerical Methods Lecture 2 Simultaneous Equations
CGN 42 - Computer Methods Numerical Methods Lecture 2 Simultaneous Equations Topics: matrix operations solving systems of equations Matrix operations: Adding / subtracting Transpose Multiplication Adding
More informationMATH 423 Linear Algebra II Lecture 33: Diagonalization of normal operators.
MATH 423 Linear Algebra II Lecture 33: Diagonalization of normal operators. Adjoint operator and adjoint matrix Given a linear operator L on an inner product space V, the adjoint of L is a transformation
More informationVectors for Beginners
Vectors for Beginners Leo Dorst September 6, 2007 1 Three ways of looking at linear algebra We will always try to look at what we do in linear algebra at three levels: geometric: drawing a picture. This
More informationQuantum computing with superconducting qubits Towards useful applications
Quantum computing with superconducting qubits Towards useful applications Stefan Filipp IBM Research Zurich Switzerland Forum Teratec 2018 June 20, 2018 Palaiseau, France Why Quantum Computing? Why now?
More informationDesigning Information Devices and Systems I Fall 2018 Lecture Notes Note Introduction to Linear Algebra the EECS Way
EECS 16A Designing Information Devices and Systems I Fall 018 Lecture Notes Note 1 1.1 Introduction to Linear Algebra the EECS Way In this note, we will teach the basics of linear algebra and relate it
More informationLecture 1: Review of linear algebra
Lecture 1: Review of linear algebra Linear functions and linearization Inverse matrix, least-squares and least-norm solutions Subspaces, basis, and dimension Change of basis and similarity transformations
More information