Logistic Regression in Python - Limitations
As you have seen from the above example, applying logistic regression to a machine learning problem is not a difficult task. However, it comes with its own limitations. Logistic regression cannot handle a large number of categorical features. In the example we have discussed so far, we reduced the number of features to a very large extent.
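To see why a large number of categorical features becomes a problem, consider the short sketch below. The data frame and column names (including the high-cardinality "zip_code" column) are hypothetical and are not taken from the data set used in the earlier chapters; the point is only that one-hot encoding a single column with many categories can add hundreds of sparse columns for the model to fit.

```python
# Minimal sketch: one-hot encoding a high-cardinality categorical column
# inflates the number of features a logistic regression model must learn from.
import pandas as pd
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data with one numeric and two categorical columns;
# "zip_code" has many distinct values, as real categorical features often do.
df = pd.DataFrame({
    "age": rng.integers(18, 90, size=1000),
    "job": rng.choice(["admin", "technician", "services", "management"], size=1000),
    "zip_code": rng.choice([f"Z{i:04d}" for i in range(500)], size=1000),
})

encoded = pd.get_dummies(df, columns=["job", "zip_code"])
print("columns before encoding:", df.shape[1])       # 3
print("columns after encoding: ", encoded.shape[1])  # a few hundred, one per observed category
```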
However, if those features had been important for our prediction, we would have been forced to include them, and then logistic regression would have failed to give us good accuracy. Logistic regression is also vulnerable to overfitting. It cannot be applied to non-linear problems, and it performs poorly when the independent variables are not correlated with the target or are correlated with each other. Thus, you will have to carefully evaluate the suitability of logistic regression for the problem you are trying to solve.
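The non-linearity limitation is easy to demonstrate. The sketch below uses scikit-learn's synthetic make_moons data, rather than the data set from the earlier chapters, to compare logistic regression with a decision tree on classes separated by a curved boundary; the linear model typically scores noticeably lower.

```python
# Illustration of the non-linearity limitation: on data whose classes are
# separated by a curved boundary, logistic regression trails a non-linear model.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=1000, noise=0.25, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

log_reg = LogisticRegression().fit(X_train, y_train)
tree = DecisionTreeClassifier(max_depth=5, random_state=42).fit(X_train, y_train)

print("Logistic regression accuracy:", log_reg.score(X_test, y_test))
print("Decision tree accuracy:      ", tree.score(X_test, y_test))
```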
There are many areas of machine learning for which other techniques have been devised. To name a few, we have algorithms such as k-nearest neighbours (kNN), linear regression, Support Vector Machines (SVM), decision trees, Naive Bayes, and so on. Before finalizing a particular model, you will have to evaluate the applicability of these various techniques to the problem you are trying to solve.
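One common way to carry out such an evaluation is to compare the candidate algorithms with cross-validation on the same data before committing to one. The sketch below uses a synthetic data set purely for illustration; in practice you would substitute your own prepared feature matrix and target vector.

```python
# Sketch: shortlist candidate algorithms by comparing their cross-validated
# accuracy on the same data; X and y stand in for your prepared features and target.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "kNN": KNeighborsClassifier(),
    "SVM": SVC(),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Naive Bayes": GaussianNB(),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```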