Machine Learning − A Concise Tutorial

Machine Learning - Stochastic Gradient Descent


Gradient Descent is a popular optimization algorithm that is used to minimize the cost function of a machine learning model. It works by iteratively adjusting the model parameters to minimize the difference between the predicted output and the actual output. The algorithm works by calculating the gradient of the cost function with respect to the model parameters and then adjusting the parameters in the opposite direction of the gradient.
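As an illustration (not part of the original tutorial), here is a minimal sketch of batch Gradient Descent minimizing the mean squared error of a simple linear model. The toy data, learning rate, and iteration count are arbitrary choices for demonstration:

```python
import numpy as np

# Toy data: y = 2x + 1 with a little noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=50)
y = 2 * X + 1 + 0.01 * rng.normal(size=50)

w, b = 0.0, 0.0   # model parameters, initialized here to zero
lr = 0.1          # learning rate (step size)

for _ in range(500):
   error = (w * X + b) - y
   # Gradient of the MSE cost with respect to w and b,
   # computed over the ENTIRE dataset
   grad_w = 2 * np.mean(error * X)
   grad_b = 2 * np.mean(error)
   # Move the parameters in the opposite direction of the gradient
   w -= lr * grad_w
   b -= lr * grad_b

print(round(w, 2), round(b, 2))  # should end up close to the true values 2 and 1
```

Note that every update above touches all 50 examples; this is exactly the per-update cost that Stochastic Gradient Descent avoids.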


Stochastic Gradient Descent is a variant of Gradient Descent that updates the parameters for each training example instead of updating them after evaluating the entire dataset. This means that instead of using the entire dataset to calculate the gradient of the cost function, SGD only uses a single training example. This approach allows the algorithm to converge faster and requires less memory to store the data.

Working of Stochastic Gradient Descent Algorithm


Stochastic Gradient Descent works by randomly selecting a single training example from the dataset and using it to update the model parameters. This process is repeated for a fixed number of epochs, or until the model converges to a minimum of the cost function.


Here’s how the Stochastic Gradient Descent algorithm works −

  1. Initialize the model parameters to random values.

  2. For each epoch, randomly shuffle the training data.

  3. For each training example − calculate the gradient of the cost function with respect to the model parameters, then update the parameters in the opposite direction of the gradient.

  4. Repeat until convergence.
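The steps above can be sketched in plain NumPy. This is a hypothetical linear-regression example with a squared-error cost; the data and hyperparameters are illustrative, not from the tutorial:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy regression data: y = 3x - 2 plus a little noise
X = rng.uniform(-1, 1, size=100)
y = 3 * X - 2 + 0.05 * rng.normal(size=100)

# Step 1: initialize the model parameters to random values
w, b = rng.normal(), rng.normal()
lr = 0.05

for epoch in range(30):
   # Step 2: for each epoch, randomly shuffle the training data
   order = rng.permutation(len(X))
   # Step 3: for each training example...
   for i in order:
      error = (w * X[i] + b) - y[i]
      # ...calculate the gradient of the squared-error cost
      grad_w = 2 * error * X[i]
      grad_b = 2 * error
      # ...update the parameters opposite to the gradient
      w -= lr * grad_w
      b -= lr * grad_b

# Step 4 (repeat until convergence) is approximated here
# by a fixed number of epochs.
print(round(w, 2), round(b, 2))  # should approach the true values 3 and -2
```

Each update here reads a single example, so the parameters take many small, noisy steps per epoch instead of one exact step.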


The main difference between Stochastic Gradient Descent and regular Gradient Descent is the way that the gradient is calculated and the way that the model parameters are updated. In Stochastic Gradient Descent, the gradient is calculated using a single training example, while in Gradient Descent, the gradient is calculated using the entire dataset.
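To make the contrast concrete, here is a hypothetical snippet computing both flavors of the gradient for a squared-error cost on the same toy data (the names, data, and the chosen example index are illustrative):

```python
import numpy as np

X = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])   # generated from y = 2x + 1
w, b = 0.5, 0.0
error = (w * X + b) - y

# Gradient Descent: gradient averaged over the ENTIRE dataset
batch_grad_w = 2 * np.mean(error * X)
batch_grad_b = 2 * np.mean(error)

# Stochastic Gradient Descent: gradient from ONE example (here index 2)
i = 2
sgd_grad_w = 2 * error[i] * X[i]
sgd_grad_b = 2 * error[i]

print(batch_grad_w, sgd_grad_w)  # prints -13.5 -16.0
```

The single-example gradient points in roughly the same direction as the full-batch one but differs in magnitude, which is why SGD's trajectory is noisier yet far cheaper per update.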

Implementation of Stochastic Gradient Descent in Python


Let’s look at an example of how to implement Stochastic Gradient Descent in Python. We will use the scikit-learn library to implement the algorithm on the Iris dataset which is a popular dataset used for classification tasks. In this example we will be predicting Iris flower species using its two features namely sepal width and sepal length −

Example

# Import required libraries
from sklearn import datasets, metrics
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Load the Iris flower dataset
iris = datasets.load_iris()
X_data, y_data = iris.data, iris.target

# Keep only the first two attributes (sepal length and sepal width)
X, y = X_data[:, :2], y_data

# Split the dataset into a training and a testing set (20 percent for testing)
X_train, X_test, y_train, y_test = train_test_split(X, y,
   test_size=0.20, random_state=1)

# Standardize the features
scaler = StandardScaler().fit(X_train)
X_train = scaler.transform(X_train)
X_test = scaler.transform(X_test)

# Create the linear model SGDClassifier
clfmodel_SGD = SGDClassifier(alpha=0.001, max_iter=200)

# Train the classifier using the fit() method
clfmodel_SGD.fit(X_train, y_train)

# Evaluate the accuracy on the training data
y_train_pred = clfmodel_SGD.predict(X_train)
print("\nThe Accuracy of SGD classifier is:",
   metrics.accuracy_score(y_train, y_train_pred)*100)


When you run this code, it will produce output similar to the following (the exact accuracy may vary from run to run, since the classifier's internal shuffling is not seeded) −

The Accuracy of SGD classifier is: 77.5
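The score above is measured on the training data. To gauge generalization, one could also score the held-out test set; the following sketch repeats the pipeline with a `random_state` added to `SGDClassifier` (an assumption not in the original code, used here only to make the run reproducible):

```python
from sklearn import datasets, metrics
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

iris = datasets.load_iris()
X, y = iris.data[:, :2], iris.target

X_train, X_test, y_train, y_test = train_test_split(
   X, y, test_size=0.20, random_state=1)

scaler = StandardScaler().fit(X_train)
X_train = scaler.transform(X_train)
X_test = scaler.transform(X_test)

# random_state fixes the internal shuffling so the result is reproducible
clf = SGDClassifier(alpha=0.001, max_iter=200, random_state=1)
clf.fit(X_train, y_train)

# Score the model on data it has never seen
test_acc = metrics.accuracy_score(y_test, clf.predict(X_test)) * 100
print("Test accuracy:", test_acc)
```

Comparing training and test accuracy in this way is a quick check for overfitting.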