Artificial Neural Network Tutorial
Associate Memory Network
These neural networks work on the basis of pattern association: they can store a set of patterns and, when given an input, produce the stored pattern that best matches it. Memories of this type are also called Content-Addressable Memory (CAM), because an associative memory performs a parallel search through the stored patterns.
Following are the two types of associative memories we can observe −
- Auto Associative Memory
- Hetero Associative Memory
Auto Associative Memory
This is a single layer neural network in which the input training vectors and the output target vectors are the same. The weights are determined so that the network stores a set of patterns.
Architecture
As shown in the following figure, the architecture of the Auto Associative Memory network has ‘n’ input training vectors and ‘n’ corresponding output target vectors.
Training Algorithm
For training, this network uses the Hebb or Delta learning rule.
Step 1 − Initialize all the weights to zero as wij = 0 (i = 1 to n, j = 1 to n)
Step 2 − Perform steps 3-4 for each input vector.
Step 3 − Activate each input unit as follows −
x_{i}\:=\:s_{i}\:(i\:=\:1\:to\:n)
Step 4 − Activate each output unit as follows −
y_{j}\:=\:s_{j}\:(j\:=\:1\:to\:n)
Step 5 − Adjust the weights as follows −
w_{ij}(new)\:=\:w_{ij}(old)\:+\:x_{i}y_{j}
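The five training steps above can be sketched in NumPy. The function name and the example pattern are illustrative, not part of the original algorithm:

```python
import numpy as np

def train_auto(patterns):
    """Hebb-rule training for an auto-associative net (a sketch).

    patterns: bipolar (+1/-1) training vectors, each of length n.
    Returns the n x n weight matrix W.
    """
    n = len(patterns[0])
    W = np.zeros((n, n))               # Step 1: w_ij = 0
    for s in patterns:                 # Step 2: for each input vector
        x = np.asarray(s)              # Step 3: x_i = s_i
        y = np.asarray(s)              # Step 4: y_j = s_j (output = input)
        W += np.outer(x, y)            # Step 5: w_ij(new) = w_ij(old) + x_i*y_j
    return W

W = train_auto([[1, -1, 1, -1]])       # store one bipolar pattern
```

With a single stored pattern, W is simply the outer product of the pattern with itself.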
Testing Algorithm
Step 1 − Set the weights to those obtained during training with Hebb’s rule.
Step 2 − Perform steps 3-5 for each input vector.
Step 3 − Set the activation of the input units equal to that of the input vector.
Step 4 − Calculate the net input to each output unit, for j = 1 to n −
y_{inj}\:=\:\displaystyle\sum\limits_{i=1}^n x_{i}w_{ij}
Step 5 − Apply the following activation function to calculate the output
y_{j}\:=\:f(y_{inj})\:=\:\begin{cases}+1 & if\:y_{inj}\:>\:0\\-1 & if\:y_{inj}\:\leqslant\:0\end{cases}
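The testing phase can be sketched the same way. Here `W` is re-created by Hebb training on one stored pattern, and the noisy probe vector is illustrative:

```python
import numpy as np

def recall_auto(W, x):
    """Recall from an auto-associative net (a sketch)."""
    y_in = np.asarray(x) @ W           # Step 4: y_inj = sum_i x_i * w_ij
    return np.where(y_in > 0, 1, -1)   # Step 5: +1 if y_inj > 0, else -1

s = np.array([1, -1, 1, -1])           # stored pattern
W = np.outer(s, s)                     # weights from Hebb training
noisy = [1, -1, 1, 1]                  # probe with one flipped bit
recalled = recall_auto(W, noisy)       # recovers the stored pattern
```

The net input for the noisy probe is (x · s) s, which still points in the direction of the stored pattern, so the sign function recovers it.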
Hetero Associative Memory
Similar to the Auto Associative Memory network, this is also a single layer neural network. However, in this network the input training vectors and the output target vectors are not the same. The weights are determined so that the network stores a set of pattern pairs. The hetero-associative network is static in nature, so it involves no non-linear or delay operations.
Architecture
As shown in the following figure, the architecture of the Hetero Associative Memory network has ‘n’ input training vectors and ‘m’ output target vectors.
Training Algorithm
For training, this network uses the Hebb or Delta learning rule.
Step 1 − Initialize all the weights to zero as wij = 0 (i = 1 to n, j = 1 to m)
Step 2 − Perform steps 3-4 for each input vector.
Step 3 − Activate each input unit as follows −
x_{i}\:=\:s_{i}\:(i\:=\:1\:to\:n)
Step 4 − Activate each output unit as follows −
y_{j}\:=\:s_{j}\:(j\:=\:1\:to\:m)
Step 5 − Adjust the weights as follows −
w_{ij}(new)\:=\:w_{ij}(old)\:+\:x_{i}y_{j}
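The hetero-associative training steps differ from the auto-associative case only in that each input vector is paired with a separate target vector of length m. A minimal sketch, with illustrative names and data:

```python
import numpy as np

def train_hetero(inputs, targets):
    """Hebb-rule training for a hetero-associative net (a sketch).

    inputs: bipolar vectors of length n, paired with targets of length m.
    Returns the n x m weight matrix W.
    """
    n, m = len(inputs[0]), len(targets[0])
    W = np.zeros((n, m))               # Step 1: w_ij = 0
    for s, t in zip(inputs, targets):  # Step 2: for each training pair
        x = np.asarray(s)              # Step 3: x_i = s_i
        y = np.asarray(t)              # Step 4: y_j set from the target vector
        W += np.outer(x, y)            # Step 5: w_ij(new) = w_ij(old) + x_i*y_j
    return W

W = train_hetero([[1, 1, -1, -1]], [[1, -1]])   # n = 4 inputs, m = 2 outputs
```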
Testing Algorithm
Step 1 − Set the weights to those obtained during training with Hebb’s rule.
Step 2 − Perform steps 3-5 for each input vector.
Step 3 − Set the activation of the input units equal to that of the input vector.
Step 4 − Calculate the net input to each output unit, for j = 1 to m −
y_{inj}\:=\:\displaystyle\sum\limits_{i=1}^n x_{i}w_{ij}
Step 5 − Apply the following activation function to calculate the output
y_{j}\:=\:f(y_{inj})\:=\:\begin{cases}+1 & if\:y_{inj}\:>\:0\\0 & if\:y_{inj}\:=\:0\\-1 & if\:y_{inj}\:<\:0\end{cases}
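The only change from auto-associative testing is the three-valued activation function above, which also outputs 0 when the net input is exactly zero. A sketch with illustrative data:

```python
import numpy as np

def recall_hetero(W, x):
    """Recall from a hetero-associative net (a sketch)."""
    y_in = np.asarray(x) @ W                    # Step 4: y_inj = sum_i x_i*w_ij
    return np.where(y_in > 0, 1,                # Step 5: +1 / 0 / -1 output
                    np.where(y_in < 0, -1, 0))

s = np.array([1, 1, -1, -1])                    # stored input (n = 4)
t = np.array([1, -1])                           # paired target (m = 2)
W = np.outer(s, t)                              # Hebb-trained weights
out = recall_hetero(W, s)                       # recalls the paired target
```

A probe orthogonal to the stored input gives a zero net input, so every output unit fires 0 through the middle branch of the activation function.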