Artificial Neural Network Tutorial

Artificial Neural Network - Basic Concepts

Neural networks are parallel computing devices that are, in essence, an attempt to build a computer model of the brain. The main objective is to develop a system that performs various computational tasks faster than traditional systems. These tasks include pattern recognition and classification, approximation, optimization, and data clustering.

What is Artificial Neural Network?

An Artificial Neural Network (ANN) is an efficient computing system whose central theme is borrowed from the analogy of biological neural networks. ANNs are also known as “artificial neural systems,” “parallel distributed processing systems,” or “connectionist systems.” An ANN consists of a large collection of units that are interconnected in some pattern to allow communication between them. These units, also referred to as nodes or neurons, are simple processors that operate in parallel.

Every neuron is connected to other neurons through connection links. Each connection link is associated with a weight that carries information about the input signal. This is the most useful information for a neuron when solving a particular problem, because the weight usually excites or inhibits the signal being communicated. Each neuron has an internal state, called an activation signal. The output signal, produced after combining the input signals with the activation rule, may be sent on to other units.
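
As a concrete illustration of these ideas, the following minimal Python sketch (not part of the original text; the class and attribute names are illustrative assumptions) represents neurons as simple units joined by weighted connection links, where a positive weight excites and a negative weight inhibits the communicated signal.

class Neuron:
    """A simple processing unit with weighted incoming connection links."""

    def __init__(self, name):
        self.name = name
        self.incoming = []        # list of (source_neuron, weight) pairs
        self.activation = 0.0     # internal state (activation signal)

    def connect_from(self, source, weight):
        # A connection link carries a weight that excites (positive)
        # or inhibits (negative) the signal being communicated.
        self.incoming.append((source, weight))

# Two input units and one output unit, wired with weighted links
x1, x2 = Neuron("x1"), Neuron("x2")
y = Neuron("y")
y.connect_from(x1, 0.6)    # excitatory link
y.connect_from(x2, -0.4)   # inhibitory link

How a unit combines its weighted inputs and applies an activation rule is formalized in the model at the end of this chapter.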

A Brief History of ANN

The history of ANN can be divided into the following three eras −

ANN during 1940s to 1960s

Some key developments of this era are as follows −

  1. 1943 − The concept of neural networks is generally considered to have started with the work of the physiologist Warren McCulloch and the mathematician Walter Pitts, who in 1943 modeled a simple neural network using electrical circuits in order to describe how neurons in the brain might work.

  2. 1949 − Donald Hebb’s book, The Organization of Behavior, put forth the idea that repeated activation of one neuron by another increases the strength of the connection between them each time they are used.

  3. 1956 − An associative memory network was introduced by Taylor.

  4. 1958 − Rosenblatt invented the Perceptron, a learning method for the McCulloch-Pitts neuron model.

  5. 1960 − Bernard Widrow and Marcian Hoff developed models called “ADALINE” and “MADALINE.”

ANN during 1960s to 1980s

Some key developments of this era are as follows −

  1. 1961 − Rosenblatt proposed a “backpropagation” scheme for multilayer networks, although the attempt was unsuccessful.

  2. 1964 − Taylor constructed a winner-take-all circuit with inhibitions among output units.

  3. 1969 − Minsky and Papert published Perceptrons, which analyzed the limitations of single-layer perceptron networks.

  4. 1971 − Kohonen developed Associative memories.

  5. 1976 − Stephen Grossberg and Gail Carpenter developed Adaptive Resonance Theory.

ANN from 1980s till Present

Some key developments of this era are as follows −

  1. 1982 − The major development was Hopfield’s Energy approach.

  2. 1985 − Boltzmann machine was developed by Ackley, Hinton, and Sejnowski.

  3. 1986 − Rumelhart, Hinton, and Williams introduced the Generalised Delta Rule.

  4. 1988 − Kosko developed the Bidirectional Associative Memory (BAM) and also introduced the concept of fuzzy logic in ANN.

The historical review shows that significant progress has been made in this field. Neural network based chips are emerging and applications to complex problems are being developed. Surely, today is a period of transition for neural network technology.

Biological Neuron

A nerve cell (neuron) is a special biological cell that processes information. It is estimated that there are a huge number of neurons, approximately 10^11, with an enormous number of interconnections, approximately 10^15.

Schematic Diagram

(Schematic diagram of a biological neuron)

Working of a Biological Neuron

As shown in the above diagram, a typical neuron consists of the following four parts, with the help of which we can explain its working −

  1. Dendrites − They are tree-like branches, responsible for receiving information from the other neurons to which the neuron is connected. In a sense, they act as the ears of the neuron.

  2. Soma − It is the cell body of the neuron and is responsible for processing the information received from the dendrites.

  3. Axon − It works like a cable through which the neuron sends information.

  4. Synapses − They are the connections between the axon of one neuron and the dendrites of other neurons.

ANN versus BNN

Before taking a look at the differences between Artificial Neural Network (ANN) and Biological Neural Network (BNN), let us take a look at the similarities in terminology between the two.

Biological Neural Network (BNN)     Artificial Neural Network (ANN)
Soma                                Node
Dendrites                           Input
Synapse                             Weights or Interconnections
Axon                                Output

The following table shows a comparison between ANN and BNN based on a few criteria.

Processing
BNN − Massively parallel, slow but superior to ANN
ANN − Massively parallel, fast but inferior to BNN

Size
BNN − Approximately 10^11 neurons and 10^15 interconnections
ANN − 10^2 to 10^4 nodes (mainly depends on the type of application and the network designer)

Learning
BNN − Can tolerate ambiguity
ANN − Requires very precise, structured and formatted data to tolerate ambiguity

Fault tolerance
BNN − Capable of robust performance even with partial damage, hence inherently fault tolerant
ANN − Performance degrades with even partial damage

Storage capacity
BNN − Stores the information in the synapses
ANN − Stores the information in continuous memory locations

Model of Artificial Neural Network

The following diagram represents the general model of an ANN; its processing is described below.

(General model of an artificial neural network)

For the above general model of artificial neural network, the net input can be calculated as follows −

$$y_{in} = x_{1}w_{1} + x_{2}w_{2} + x_{3}w_{3} + \dots + x_{m}w_{m}$$

i.e., net input $y_{in} = \sum_{i=1}^{m} x_{i}w_{i}$

The output can be calculated by applying the activation function over the net input.

$$Y = F(y_{in})$$

Output = function (net input calculated)
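
To illustrate these formulas concretely, here is a minimal Python sketch (not from the original tutorial) that computes the net input $y_{in} = \sum_{i=1}^{m} x_{i}w_{i}$ and applies a binary step function as the activation F; the specific input values, weights, and threshold are arbitrary assumptions chosen for the example.

# Minimal sketch of the general ANN model: net input followed by activation.
# The inputs, weights, and threshold below are illustrative values only.

def net_input(x, w):
    """y_in = x1*w1 + x2*w2 + ... + xm*wm"""
    return sum(xi * wi for xi, wi in zip(x, w))

def step_activation(y_in, threshold=0.5):
    """A simple binary step function used here as the activation F."""
    return 1 if y_in >= threshold else 0

x = [1.0, 0.0, 1.0]          # input signals x1..x3
w = [0.5, 0.3, 0.25]         # weights w1..w3 on the connection links

y_in = net_input(x, w)       # 1.0*0.5 + 0.0*0.3 + 1.0*0.25 = 0.75
Y = step_activation(y_in)    # 0.75 >= 0.5, so Y = 1
print(y_in, Y)               # prints: 0.75 1

Running the sketch prints 0.75 1, matching the hand calculation of the net input and the thresholded output.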