UNIT II ASSOCIATIVE AND
UNSUPERVISED LEARNING NETWORKS
Training Algorithms for Pattern Association-
Autoassociative Memory Network-Heteroassociative
Memory Network- Bidirectional Associative Memory (BAM)-
Hopfield Networks-Iterative Autoassociative Memory
Networks-Temporal Associative Memory Network-Fixed
Weight Competitive Nets-Kohonen Self-Organizing Feature
Maps-Learning Vector Quantization-Counterpropagation
Networks-Adaptive Resonance Theory Network
Topic I: Training Algorithms for Pattern
Association
Associative Memory Networks
An associative memory network can store a set of patterns as
memories.
When the associative memory is presented with a key
pattern, it responds by producing the stored pattern that
most closely resembles or relates to the key pattern.
Thus, recall is through association of the key pattern with
the memorized information.
These types of memories are also called content-
addressable memories (CAM).
The CAM can also be viewed as associating data to address,
i.e., for every data item in the memory there is a
corresponding unique address.
Here the input data is correlated with the stored data in
the CAM.
It should be noted that the stored patterns must be unique,
i.e., a different pattern in each location.
If the same pattern exists in more than one location in the
CAM, then, even though the correlation is correct, the
address is ambiguous.
Associative memory performs a parallel search within a
stored data file.
The idea behind this search is to output any one or all
stored items that match the given search argument.
Training Algorithms for Pattern
Association
There are two algorithms developed for training
pattern association nets:
Hebb Rule
Outer Products Rule
1. Hebb Rule
The Hebb rule is widely used for finding the
weights of an associative memory neural network.
The training vector pairs are denoted s:t.
The weights are updated until there is no weight
change.
Hebb Rule Algorithm
Step 0: Set all the initial weights to zero, i.e.,
wij = 0 (i = 1 to n, j = 1 to m)
Step 1: For each training input-target output
vector pair s:t,
perform Steps 2-4.
Step 2: Activate the input layer units to the current
training input,
xi = si (for i = 1 to n)
Step 3: Activate the output layer units to the current
target output,
yj = tj (for j = 1 to m)
Step 4: Update the weights,
wij(new) = wij(old) + xi yj
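As a concrete illustration, here is a minimal NumPy sketch of Hebb-rule training for a pattern association net (the function name hebb_train and the bipolar sample pairs are illustrative assumptions, not from the source):

```python
import numpy as np

def hebb_train(pairs):
    """Hebb-rule training: accumulate wij += xi * yj over all s:t pairs."""
    n = len(pairs[0][0])          # input dimension
    m = len(pairs[0][1])          # output dimension
    W = np.zeros((n, m))          # Step 0: all weights start at zero
    for s, t in pairs:            # Step 1: loop over the training pairs
        x = np.array(s)           # Step 2: activate input units, xi = si
        y = np.array(t)           # Step 3: activate output units, yj = tj
        W += np.outer(x, y)       # Step 4: wij(new) = wij(old) + xi*yj
    return W

# Two illustrative bipolar training pairs
pairs = [([1, -1, 1, -1], [1, -1]),
         ([1, 1, -1, -1], [-1, 1])]
print(hebb_train(pairs))
```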
2. Outer Products Rule
Outer products rule is a method for finding the weights of
an associative net.
Input: s = (s1, ..., si, ..., sn)
Output: t = (t1, ..., tj, ..., tm)
The outer product of the two vectors is the matrix product
of S = s^T and T = t, i.e., of an [n x 1] matrix and a
[1 x m] matrix.
The transpose is taken of the given input vector.
s^T t = [s1 ... si ... sn]^T [t1 ... tj ... tm],
an n x m matrix whose (i, j) entry is si tj.
This weight matrix is the same as the weight matrix
obtained by the Hebb rule to store the pattern
association s:t.
For storing a set of associations s(p):t(p), p = 1 to P,
wherein
s(p) = (s1(p), ..., si(p), ..., sn(p))
t(p) = (t1(p), ..., tj(p), ..., tm(p)),
the weight matrix W = {wij} can be given as
wij = sum from p = 1 to P of si(p) tj(p),
i.e., W = sum from p = 1 to P of s(p)^T t(p).
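A minimal sketch of the same computation with NumPy's outer product (the function name outer_product_weights and the sample pairs are illustrative); the result coincides with hebb_train above:

```python
import numpy as np

def outer_product_weights(pairs):
    """W = sum over p of s(p)^T t(p), built from outer products."""
    s0, t0 = pairs[0]
    W = np.zeros((len(s0), len(t0)))
    for s, t in pairs:
        W += np.outer(s, t)       # [n x 1] times [1 x m] -> n x m term
    return W

pairs = [([1, -1, 1, -1], [1, -1]),
         ([1, 1, -1, -1], [-1, 1])]
print(outer_product_weights(pairs))   # same matrix as hebb_train(pairs)
```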
There are two types of associative memories:
Auto Associative Memory
Hetero Associative memory
Auto Associative Memory
It recovers a previously stored pattern that most
closely relates to the current pattern.
It is also known as an auto-associative
correlator.
Here, the input vector and output vector are
the same.
Auto Associative Memory Algorithm - Training
Algorithm
It uses the Hebb or Delta learning rule.
Step 1 − Initialize all the weights to zero as
wij = 0 (i = 1 to n, j = 1 to n)
Step 2 − Perform steps 3-5 for each input vector.
Step 3 − Activate each input unit as follows: xi = si (i = 1 to n)
Step 4 − Activate each output unit as follows: yj = sj (j = 1 to n)
Step 5 − Adjust the weights as follows:
wij(new) = wij(old) + xi yj
The weights can also be determined from the Hebb rule or
outer products rule.
Auto Associative Memory Algorithm -
Testing Algorithm
Step 1 − Set the weights obtained during training by Hebb's
rule.
Step 2 − Perform steps 3-5 for each input vector.
Step 3 − Set the activation of the input units equal to that of
the input vector.
Step 4 − Calculate the net input to each output unit, j = 1 to n:
yinj = sum from i = 1 to n of xi wij
Step 5 − Apply the following activation function to calculate
the output:
yj = f(yinj) = +1 if yinj > 0, and -1 if yinj <= 0
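Putting training and testing together, a minimal sketch assuming bipolar patterns and the sign activation above (the pattern values and function names are illustrative):

```python
import numpy as np

def autoassoc_train(patterns):
    """Store patterns with the Hebb rule: W = sum of s^T s."""
    n = len(patterns[0])
    W = np.zeros((n, n))
    for s in patterns:
        W += np.outer(s, s)       # input and output vectors are the same
    return W

def autoassoc_recall(W, x):
    """One-shot recall: yj = +1 if the net input is positive, else -1."""
    y_in = x @ W                  # net input yinj = sum_i xi * wij
    return np.where(y_in > 0, 1, -1)

stored = np.array([1, 1, -1, -1])
W = autoassoc_train([stored])
noisy = np.array([1, -1, -1, -1])    # one component flipped
print(autoassoc_recall(W, noisy))    # recovers [ 1  1 -1 -1]
```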
Hetero Associative memory
The training input and the target output vectors are
different.
The weights are determined in a way that the network can
store a set of pattern associations.
The association here is a pair of training input-target output
vectors (s(p), t(p)), with p = 1, 2, ..., P.
Each vector s(p) has ‘n’ components and each vector t(p) has
‘m’ components.
The determination of weights is done either by using Hebb
rule or delta rule.
The net finds an appropriate output vector corresponding
to an input vector x, which may be either one of the stored
patterns or a new pattern.
Hetero Associative Memory Algorithm -Training
Algorithm
Step 1 − Initialize all the weights to zero as
wij = 0 (i = 1 to n, j = 1 to m)
Step 2 − Perform steps 3-5 for each training pair.
Step 3 − Activate each input unit as follows:
xi = si (i = 1 to n)
Step 4 − Activate each output unit to the current target output:
yj = tj (j = 1 to m)
Step 5 − Adjust the weights as follows:
wij(new) = wij(old) + xi yj
The weights can also be determined from the Hebb rule or
outer products rule.
Hetero Associative Memory Algorithm -Testing
Algorithm
Step 1 − Set the weights obtained during training by
Hebb's rule.
Step 2 − Perform steps 3-5 for each input vector.
Step 3 − Set the activation of the input units
equal to that of the input vector.
Step 4 − Calculate the net input to each output
unit, j = 1 to m:
yinj = sum from i = 1 to n of xi wij
Step 5 − Apply the following activation function
to calculate the output:
yj = +1 if yinj > 0; yj = 0 if yinj = 0; yj = -1 if yinj < 0
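A minimal sketch of heteroassociative storage and recall under the same assumptions (function names and sample pairs are illustrative):

```python
import numpy as np

def hetero_train(pairs):
    """W = sum over p of s(p)^T t(p); n-dim inputs map to m-dim targets."""
    n, m = len(pairs[0][0]), len(pairs[0][1])
    W = np.zeros((n, m))
    for s, t in pairs:
        W += np.outer(s, t)
    return W

def hetero_recall(W, x):
    """Pass the net input through the +1 / 0 / -1 activation."""
    return np.sign(x @ W).astype(int)

pairs = [([1, -1, 1, -1], [1, -1]),
         ([1, 1, -1, -1], [-1, 1])]
W = hetero_train(pairs)
print(hetero_recall(W, np.array([1, -1, 1, -1])))   # -> [ 1 -1]
```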